In pursuit of slowdowns

A shared database has been subject to severe slowdowns after it’s been in use for a short time. At startup it runs very nicely, but the beachballs are inevitable.

The file has about 800 fields (really) and about 50 forms. I’m not convinced those are the source of the slowdowns, though. Several things have been found and improved, but the slowdowns persist.

Today I discovered that the forms have loads of Text Editors set to popup editing, carried over from their conversion from Panorama 6.

I tried editing a form’s Blueprint to turn off popup editing, but when saving, there’s a long delay before I get a message that I’ve run out of application memory.

Rather than manually clicking through each form to toggle each one off, I’m trying to think of a way to automate a cruise through each form to do it.

It seems that there’s a way to select objects by type, but I haven’t found the right objectinfo( option. The idea would then be to use changeobjects to turn off the popup option.

If anyone has ideas, my ears are open.

Have you tried this?

local x
selectnoobjects
selectobjects objectinfo("$TextEditorPopUpEditing")
x=objectinfo("count")
if x=0
    message "No objects found…"
    rtn
endif
changeobjects "$TextEditorPopUpEditing","0"
message "Changed: "+pattern(x,"# object~")

I’m pretty sure that all you need is this one line.

changeobjects "$TextEditorPopUpEditing","0"

I believe the changeobjects statement will ignore any object that this attribute doesn’t apply to, in other words, this will only affect Text Editor objects.

If you did want to select only text editor objects for some reason, the correct code is:

selectobjects objectinfo("class")="TextEditorObject"

There’s no reason to use selectnoobjects before this. However, you probably would want to use it after selecting and then changing objects.
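Putting those pieces together, a sketch of the whole operation for the current form might look like this (untested; the count report is optional):

selectobjects objectinfo("class")="TextEditorObject"
let x = objectinfo("count")
changeobjects "$TextEditorPopUpEditing","0"
selectnoobjects
message "Changed: "+pattern(x,"# object~")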


However, I highly doubt that the popup editing option has anything to do with any performance issues.

Of course you know I’m going to say that 800 fields is way too many. So I won’t say it.

I don’t think 50 forms would be a problem at all, unless a large number of them were all open at once.

The one thing that is guaranteed to cause beachballs is using a form with a text list or matrix with the Database Navigation option turned on, and also having the data sheet open. Or having two or more forms open using text list or matrix. In these situations the different windows “fight” each other for control of the database and things mostly grind to a halt. Doesn’t matter if the database is shared or single user. The only solution is to have only one of these windows open at a time.

Wow - I’ve never heard of that. I wouldn’t have thought that was possible. Are you talking about the dialog that says you have to force quit applications?

Thanks to both of you for your suggestions. changeobjects "$TextEditorPopUpEditing","0" by itself did not change anything. It was necessary to select objects first.

Starting in a form window, I was able to use the following to loop through the 50 forms in less than a minute.

Let lvForms = dbinfo("Forms","")
Let lvThis = ""
LoopArray lvForms,cr(),lvThis
    GoForm lvThis
    selectobjects objectinfo("class")="TextEditorObject"
    changeobjects "$TextEditorPopUpEditing","0"
EndLoop

To find any TextListObjects or matrices, I altered it to

Let lvForms = dbinfo("Forms","")
Let lvThis = ""
LoopArray lvForms,cr(),lvThis
    Let x = ""
    GoForm lvThis
    selectobjects objectinfo("class") = "TextListObject"
    x = objectinfo("count")
    If x ≠ 0
        message "Found: "+pattern(x,"# object~")
        Stop
    EndIf
EndLoop

These two short scripts may be useful to others.

None were found, aside from those in the test form I created to make sure such objects would be found if they existed.

FWIW, besides the 800 fields, several of the 50 forms look like this:

The jury is still out on whether eliminating popup editing will matter, but some forms had a lot of Text Editors set that way.

If the slowdowns continue my next effort will be to break forms like the one pictured into 9 separate forms.

Yes, it required a Force Quit, but the form above may give you an idea of why changing and saving a new blueprint may have been problematic.

FWIW, I am the troubleshooter in this case and not the builder of these databases, although I am awed by the thought, effort, complexit,y and attention to detail that has been put into it.

Splitting the forms into more pages made only a very minor difference.

Further driving of the file suggests that the beachballs are triggered by every save of the database. You can click through checkboxes, enter text, etc., and everything is fine. Hit Save and there’s a delay of almost 20 seconds before any further work can be done.

Editing and moving to a new record without saving produces the same effect, presumably because, as a shared file, the changes are being committed.

Does the saving delay make sense? And is there any way to accelerate the saving? At 87MB, this 180-record file has 831 fields, 149 procedures, and 50 forms.

I unshared the database, cut it to one record, and reshared. Saving time was cut to about 3 seconds.

The upload took a very long time, almost ten minutes, which points to the fields, procedures, and forms as the probable issue. Would paring one of these make more difference than paring another?

I’m afraid you’re a test pilot on this. ProVUE has not done testing of shared databases with more than a couple hundred fields, and no more than a dozen or so forms. Not sure about the number of procedures - probably dozens.

I can say that the answer to this question is no. Saving is saving.

That said, I am surprised by this sort of delay when saving. First of all, saving is a local operation. The data is saved to the local drive – it is not saved to the server. Also, the save operation does not take place on a “piece by piece” basis. Panorama doesn’t save individual fields, forms, and procedures. Instead, all of the data is saved as a single contiguous block – it’s one operation, not hundreds or thousands. The same goes for forms and procedures. The amount of time is linear in the overall size. For 87MB, this should be less than a second.

Saving also will cause the current record to be committed (if it has been edited), so maybe the delay you are seeing has nothing to do with saving.
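One rough way to check that (a sketch in Panorama code; now( returns whole seconds, so the timing is only approximate):

local startTime
startTime = now()
save
message "Save took "+pattern(now()-startTime,"# second~")

Running this once on an untouched record and again right after editing a field should show whether the commit, rather than the local save, accounts for the delay.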

Of course committing does send data to the server. But again, it does not do so on a piece-by-piece basis. The current record is sent to the server as one piece. Even with 831 fields, it’s probably less than 100K of data.

That seems like a long time. Is this a very slow network connection? 87MB isn’t really all THAT big. Also, Panorama compresses the file when it uploads it, so chances are the amount of data being pushed up is more like 40-50MB. Unless the connection is very slow, I wouldn’t expect it to take 10 minutes. And once again, the upload is not piece by piece; it shouldn’t matter how many fields, procedures, and forms there are – only the aggregate size should be a factor.


Where I would expect the number of fields to make a difference is in the user interface. It takes extra time for each field to be drawn, and that can add up.

As for forms, only the number of open forms will matter, and even then it will depend heavily on the number of elements on each form. So one open form with a thousand items would be approximately the same, performance-wise, as ten open forms with a hundred items each. If a form isn’t open, it shouldn’t affect performance at all. In fact, when you open Panorama without any databases at all, there are currently 438 forms and 1,651 procedures loaded in memory in the built-in library databases included with Panorama. These have essentially no effect on performance.