Panorama X Causing Out of Application Memory Crash

Short version: Tried to Export the data in a troublesome file. It hung for a while and then I got a system error window that said “Your system has run out of application memory” and gave me a list of apps to force quit. It showed Panorama X using 60.21 GB (yes, GB) of memory for a 6.1 MB file.

Does this mean corrupted data?

I got to this point after my assistant called two days ago and said she got the same message while trying to enter data. She didn’t save any of the information, so I didn’t have anything to work with. I came in the next day, did a reboot and emptied the caches, and went on with my work in different PanX files, which worked fine.

She called again that evening: the file was crashing whenever she tried to search on the last name field, and only that field. I told her to give up and I’d deal with it this morning.

This morning I came in and tried to open the file, but it kept crashing on .initialize every time (no changes had been made that would affect .initialize before all of this started). I sent the file home to work on it there. I could get it to start up most of the time after commenting out some searches in the procedure, and then the same crashes would begin. There was also a save procedure that would cause a crash when it did some selecting, but not every time.

Decided to see if it was corrupted data and tried to export the data with the above result.

I would try two things:

  1. Try opening the troublesome file with the “data sheet only” option in the “Find & Open …” dialog. That gets you just your data, and you may be able to locate the corrupted records when no forms or procedures are involved.
  2. Duplicate the original file, open the copy, and delete all records. That gives you an empty database that keeps your procedures and forms, so you can test whether everything works correctly with a reduced data set.

An open data sheet in a db with corrupted data can be pretty volatile, with a crash occurring as soon as any bad cell is encountered. In some cases I have been able to identify the bad field with a formulafill <<>>, which just rewrites each cell with its own contents: I’d get a crash as soon as it hit a field with bad data, but no changes anywhere else. You can also loop through the records to find the offender(s), then try to export around them. Sometimes a simple export of the data to a text file will either truncate at the problem area or skip over the bad stuff. Good luck.
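In case it helps, here is a rough sketch of that loop-and-export idea as a Panorama X procedure. It is only an illustration, not tested against your file: it assumes the usual firstrecord / downrecord / loop…until statements, the exportline() and cr() functions, the filesave statement, and a placeholder file name (panx-export-test.txt), so adjust the details to your setup. The point is to rewrite the export file after every record, so if Panorama crashes partway through, the file on disk stops at the last good record and points you at the offender.

```
local exported, recordnum
exported = ""
recordnum = 0
firstrecord
loop
    recordnum = recordnum + 1
    // Append this record as a tab-delimited line, then rewrite the whole
    // file on disk. If Panorama crashes mid-loop, the saved file ends at
    // the last record that exported cleanly, so the next record is suspect.
    exported = exported + exportline() + cr()
    filesave "panx-export-test.txt", exported
    downrecord
until info("stopped")
message "All " + pattern(recordnum, "#,") + " records exported without a crash."
```

Rewriting the file on every pass is slow, but for a 6 MB database it should be tolerable, and it guarantees the file on disk reflects exactly how far the loop got before any crash.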

Ultimately fixed the problem by getting a Time Machine backup from just before the trouble started, exporting its data as text, and importing/replacing that into a copy of the bad file. That solved it, which means I probably could have just used the backup directly, but I wanted to make sure the issue was only bad data.