One of our customers' databases has become very slow. What's odd is that it is only happening to this one customer.
We recently changed to a new VistaDB version. I've got their data file, about 650 MB, and am running a backup with our application, which reads all the records and serialises all the fields as strings to a file. When I attach Visual Studio to the running app and break in, it's in a loop reading about 130,000 records and taking about 7 seconds per record. It is reading the data correctly, but an order of magnitude more slowly than normal.
Do you have any idea what can happen to a DB to cause this behaviour?
For anyone else finding this thread: we were also working this issue via a ticket. The performance problem turned out to be a large table missing the indexes it was expected to have (it's not clear how it ended up that way), resulting in unoptimized full-table scans when targeting single rows.
Query structure and which indexes are available can have a significant impact on performance, especially as the database scales up. Joins and sub-queries can also have a multiplicative effect on query time if not structured carefully.
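To see how a missing index changes the access path for a single-row lookup, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in (VistaDB is a .NET engine, so the table and index names here are illustrative, not the customer's actual schema — but the principle is the same for any relational engine):

```python
import sqlite3

# In-memory database with a hypothetical "records" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 ((i, "x" * 100) for i in range(10_000)))

def lookup_plan():
    # The last column of an EXPLAIN QUERY PLAN row describes the access path.
    row = conn.execute(
        "EXPLAIN QUERY PLAN SELECT payload FROM records WHERE id = ?", (42,)
    ).fetchone()
    return row[3]

plan_before = lookup_plan()   # no index: a full scan, e.g. "SCAN records"
conn.execute("CREATE INDEX idx_records_id ON records (id)")
plan_after = lookup_plan()    # now an indexed seek via idx_records_id
print(plan_before)
print(plan_after)
```

Most engines expose an equivalent of this query-plan inspection; checking it is usually the quickest way to confirm whether a slow lookup is scanning rather than seeking.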
A few questions immediately come to mind for things which might affect performance (although I would not expect things to slow to that degree):
- Is the database file on a local disk for the machine where the application is running, or is it accessed over a network share?
- Is the access mode (Open Mode) NonExclusive* or Exclusive* (or SharedReadOnly)?
- Does the table use any "(MAX)" types (or Image, Text, or NText)?
- What is the "page size" of the database file, and does the table contain any fields often holding data near or beyond that size (eg. VarChar or NVarChar with strings over 2,000 characters long)?
Also, external effects should be checked and ruled out:
- What is the memory load and limit on the machine (eg. from Task Manager's Performance tab), and what is the memory usage of the application process while running the backup?
- While the backup runs, what is the CPU usage for the process and for the overall system? Are any other processes using a lot themselves?
- What is the CPU usage for Visual Studio? Is it running extensions such as ReSharper, which can sometimes use a lot of cycles when rescanning files?
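For the process-level checks above, Task Manager (or Process Explorer) is the usual tool on Windows; as a rough programmatic sketch, here is how a process can snapshot its own memory and CPU usage with Python's standard library (the `resource` module is Unix-only, and the workload below is a made-up stand-in for one unit of backup work):

```python
import os
import resource
import time

# Do a burst of stand-in work so the usage figures are non-trivial.
start = time.monotonic()
sum(i * i for i in range(1_000_000))
wall = time.monotonic() - start

# Snapshot this process's own resource usage.
usage = resource.getrusage(resource.RUSAGE_SELF)
peak_rss = usage.ru_maxrss                    # peak resident set (KiB on Linux)
cpu_seconds = usage.ru_utime + usage.ru_stime # user + system CPU time

# If CPU time stays far below wall time during the backup loop, the
# process is waiting (on disk or network I/O) rather than computing.
print(f"pid={os.getpid()} peak_rss={peak_rss} cpu={cpu_seconds:.2f}s wall={wall:.3f}s")
```

A CPU-starved loop and an I/O-starved loop look identical from the outside but show opposite CPU-to-wall-time ratios, which is why both figures are worth capturing.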
To finish this off: the indexes were missing before the change in VistaDB version, and the problem was fixed by recreating them in the Data Builder. I have no idea how they went missing, though.
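The scale of the slowdown described above is consistent with per-row full scans. As a rough illustration (again using Python's sqlite3 as a stand-in, with made-up table sizes, not VistaDB or the customer's data), recreating the index turns each lookup from an O(n) scan into a B-tree seek:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 ((i, "x" * 200) for i in range(50_000)))

def time_lookups(n=200):
    # Time n single-row lookups, the access pattern the backup loop uses.
    start = time.perf_counter()
    for i in range(n):
        conn.execute("SELECT payload FROM records WHERE id = ?", (i,)).fetchone()
    return time.perf_counter() - start

t_missing = time_lookups()   # every lookup scans all 50,000 rows
conn.execute("CREATE INDEX idx_records_id ON records (id)")
t_indexed = time_lookups()   # each lookup is now an indexed seek
print(f"without index: {t_missing:.3f}s  with index: {t_indexed:.3f}s")
```

The gap grows with table size, which matches the symptom here: only the customer with the large data file noticed anything wrong.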
Thanks to VistaDB support for spotting the problem.