The Panorama Help file says that this can create a dictionary or a data array. Can I choose which one? Are there any suggestions for how I would learn how Panorama X uses jsonimport(JSON)?

My initial thought is that I would only use this to create variables. Just wondering where to start.

No, you cannot choose between a dictionary and a data array – that choice depends on what the top level of the JSON data is. If the top level is a dictionary, you’ll get a dictionary; if it is an array, you’ll get a data array. Unlike some other features like importjson, the jsonimport( function puts no restrictions on what sort of JSON structure it will accept. Whatever the structure of the JSON file is, you’ll get a mirror of that in your variable. And yes, you are correct, you would use this to create a variable.
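The same top-level rule can be seen in Python’s standard json module (an analogy only, not Panorama code): the type of the parsed result is dictated by whether the top level of the JSON text is an object or an array.

```python
import json

# The top level of the JSON text dictates the type of the result:
# a JSON object parses to a dictionary, a JSON array to a list.
obj = json.loads('{"Vendor": "007493 SCE (WT)", "Check total": 36948.22}')
arr = json.loads('["40", "06/23/2016", "007493 SCE (WT)"]')
print(type(obj).__name__)  # dict
print(type(arr).__name__)  # list
```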

Just wondering where to start

I created the JSON features to facilitate data exchange (both import and export) between Panorama and the numerous applications and web sites that use JSON. In general for a pure Panorama application, there would be no reason to use JSON. If you have an application you want to talk to that uses JSON, you would need to start by figuring out what the structure is of the JSON it generates and/or accepts.


For a given record with this content:

40,06/23/2016,007493 SCE (WT),RELOCATE POLES ON RAMONA AVE,36,948.22

PanoramaX exports this JSON:

  "Check total" : 36948.220000000001,
  "Check #" : "40",
  "Date" : "2016-06-23T00-00-00Z",
  "Vendor" : "007493 SCE (WT)"

What I want, however, is this:

["40","06/23/2016","007493 SCE (WT)","RELOCATE POLES ON RAMONA AVE","36,948.22" ]

Your answer makes me think I can’t get there from here. Should I continue to export to CSV and then head over to convertcsv.com? I’d love to be able to remove a couple steps from the process if possible.

P.S. Note that PanoramaX exports the fields out of order in JSON files, so I can’t even set up a chain of search-and-replaces to convert from PanoramaX JSON to the JSON array I need.

I may be wrong, but I don’t think what you are asking for is really JSON, but rather just a JavaScript array literal. In any case, you can turn the current record into a JavaScript array literal with this formula.

{[ "}+replace(exportline(),tab(),{","})+{" ]}

I tried this on a sample checkbook database and got this:

[ "01/01/17","101","Payroll","Blue Cross","Health Insurance Group Xbg-872","975.00","","31,852.78" ]


This will convert all fields to strings, but then the question didn’t have any other types.

It will also break when a text field contains tab characters.
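The failure mode is easy to demonstrate in Python (used here only as a neutral illustration): the naive quote-and-join string becomes invalid JSON as soon as a field contains a quote or a tab, while a real JSON encoder escapes both.

```python
import json

# A field containing a quote or a tab breaks the naive quote-and-join
# approach; json.dumps escapes both and stays valid.
fields = ['40', '06/23/2016', 'He said "hi"\tthere']
naive = '[ "' + '","'.join(fields) + '" ]'   # invalid JSON for this row
proper = json.dumps(fields)                  # escapes \" and \t
print(proper)
```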

How about this?

jsonexport(dataarray(fieldvalue(1), fieldvalue(2)))

which can be made dynamic using these parts:

for fn,1,info("fieldcount")
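Since the full Panorama loop wasn’t shown, here is a rough sketch of what the “dynamic” version would do, in Python terms: visit each field by position and append its value to the array before encoding. The field values are made up for illustration.

```python
import json

# Build the row by looping over field positions (analogous to
# `for fn,1,info("fieldcount")`) instead of naming each field.
fields = ["40", "06/23/2016", "007493 SCE (WT)",
          "RELOCATE POLES ON RAMONA AVE", "36,948.22"]
row = []
for fn in range(len(fields)):   # 0-based here; Panorama counts from 1
    row.append(fields[fn])
print(json.dumps(row))
```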

The original question did have a couple of numeric fields, and the desired output showed those fields converted to text. If that is really what he wants, my code will do it.

As you point out, my code won’t work with tabs (or binary data fields), but tabs are very unusual in Panorama databases – you have to really work to get a tab into a field.

It might be kind of nice if there was a function that would output the current record as a data array though. That would make your idea easier.

Your “dynamic” example, using dataarrayappend(, seems to be missing some code.

Thanks, guys. For some reason I hadn’t thought of creating another field to contain my export data. I do this all the time in FileMaker, but it never occurred to me while working in PanoramaX. D’oh!

Thanks for this. I don’t know the difference between a JSON array and a JavaScript array literal. At any rate, this formula works on the sample I gave, but it doesn’t escape special characters, and it doesn’t allow me to format the date and check amount fields. I forgot about the date field formatting, and didn’t realize that JSON number exports are not always just the number, but rather an approximation of the number to many decimal places.
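The “36948.220000000001” in the export is ordinary binary floating-point behavior rather than a Panorama quirk: 36948.22 has no exact double-precision representation, so an exporter that prints 17 significant digits exposes the stored approximation. A quick Python check:

```python
# 36948.22 cannot be represented exactly as a binary double; printing
# 17 significant digits reveals the stored approximation.
x = 36948.22
print(repr(x))       # shortest round-trip form: 36948.22
print(f"{x:.17g}")   # 36948.220000000001
```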

With a couple of tweaks, this worked for me:

jsonexport(dataarray(fieldvalue(1), datepattern(fieldvalue(2),"yyyy/mm/dd"), fieldvalue(3), fieldvalue(4), pattern(fieldvalue(5),"(#,.##)")))
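For comparison, here is what that tweak amounts to in Python terms (an analogy only): format the date and the amount as text before encoding, so the JSON carries the desired string forms instead of raw numbers.

```python
import json
from datetime import datetime

# Format the date and amount as strings first, then encode the row.
d = datetime.strptime("06/23/2016", "%m/%d/%Y")
row = ["40", d.strftime("%Y/%m/%d"), "007493 SCE (WT)",
       "RELOCATE POLES ON RAMONA AVE", f"{36948.22:,.2f}"]
print(json.dumps(row))
```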

I don’t know what “made dynamic” means, but thanks for your great answer.

FYI, you could have used the field names instead of fieldvalue(1), etc. That would probably be easier to read, and would still work if you later inserted additional fields. I don’t know your field names, but something like:

jsonexport(dataarray(«Check #», datepattern(Date,"yyyy/mm/dd"),Vendor,Description,pattern(«Check total»,"#,.##")))

The fieldvalue( function is handy when you don’t know the actual field names, or if you want to cycle through the fields – 1, 2, 3 – without worrying about what the field names are. In this case, since you know the names, it’s easier to just use the names.

Of course since you already have a working formula, there’s no real point in changing it, just keep using fieldvalue(. But I wanted to make this point for your potential future use and for anyone else that might read this thread later.

I’m a big fan of cargo-cult programming, and that’s the first code that worked for me. :slight_smile:

I’ve since moved the code out of a field and into a procedure, taking a different approach:

field «Check #»

local dataset,dataitem

arraybuild dataitem,¶,'',
str(«Check #»)+',"'+
pattern(«Check total»,"(#.,##)")+'"],'


filesave '',"warrants.json","TEXTR*ch",replace(dataset,"<DATA>",dataitem)

I should mention for the sake of posterity that this code is not “one size fits all.” It works in my application because I converted the quote marks (") in my data to other characters (there were only eight instances out of 36,000 records, so it was pretty low impact). There is no provision to escape special characters, and I manually have to replace ampersands (&) with an HTML entity so the final page will validate, delete the final comma, and add a header and footer. For good measure, I run the result against a JSON validator before posting it online.

I’m kind of surprised I never implemented a function to create JavaScript literals, as is available with Perl, AppleScript, etc. So I’ve made a note to do this.

Nice, thanks. Just discovered the htmlencode( function. It seems very slow, but it does eliminate the search-and-replace step after export.

The htmlencode( function uses the characterfilter( function, which is implemented in Objective-C so it should be pretty fast. Here is what the htmlencode( function is doing:

characterfilter(thetext,|||switch(import(), {‘},{'}, {’},{'}, {”},{"}, {“},{"}, {&},{&amp;}, { },{&nbsp;}, {¡},{&#161;}, {¢},{&#162;}, {£},{&#163;}, {€},{&#164;}, {¥},{&#165;}, {|},{&#124;}, {§},{&#167;}, {¨},{&#168;}, {©},{&copy;}, {ª},{&#170;}, {«},{&#171;}, {¬},{&#172;}, {–},{&#173;}, {®},{&reg;}, {¯},{&#175;}, {˚},{&#176;}, {±},{&#177;}, {´},{&#180;}, {µ},{&#181;}, {¶},{&#182;}, {•},{&#183;}, {¸},{&#184;}, {º},{&#186;}, {»},{&#187;}, {¿},{&#191;}, {À},{&Agrave;}, {Á},{&Aacute;}, {Â},{&Acirc;}, {Ã},{&Atilde;}, {Ä},{&Auml;}, {Å},{&Aring;}, {Æ},{&AElig;}, {Ç},{&Ccedil;}, {È},{&Egrave;}, {É},{&Eacute;}, {Ê},{&Ecirc;}, {Ë},{&Euml;}, {Ì},{&Igrave;}, {Í},{&Iacute;}, {Î},{&Icirc;}, {Ï},{&Iuml;}, {Ñ},{&Ntilde;}, {Ò},{&Ograve;}, {Ó},{&Oacute;}, {Ô},{&Ocirc;}, {Õ},{&Otilde;}, {Ö},{&Ouml;}, {Ù},{&Ugrave;}, {Ú},{&Uacute;}, {Û},{&Ucirc;}, {Ü},{&Uuml;}, {ß},{&zslig;}, {à},{&agrave;}, {á},{&aacute;}, {â},{&acirc;}, {ã},{&atilde;}, {ä},{&auml;}, {å},{&aring;}, {æ},{&aelig;}, {ç},{&ccedil;}, {è},{&egrave;}, {é},{&eacute;}, {ê},{&ecirc;}, {ë},{&euml;}, {ì},{&igrave;}, {í},{&iacute;}, {î},{&icirc;}, {ï},{&iuml;}, {ñ},{&ntilde;}, {ò},{&ograve;}, {ó},{&oacute;}, {ô},{&ocirc;}, {õ},{&otilde;}, {ö},{&ouml;}, {÷},{&#247;}, {ù},{&ugrave;}, {ú},{&uacute;}, {û},{&ucirc;}, {ü},{&uuml;}, {ÿ},{&yuml;}, import())|||)

When I applied it to two of my five fields, it took a few minutes to generate my export file instead of a few seconds. When I applied it instead to the entire export, it never completed. It’s not a huge deal, though, as I can do this in BBEdit in about five seconds, and I have to open the file anyway to remove the comma from the end of the last line.

Wow, I would not have expected that. How big is the data being exported?

I’ve made a note to look into this further.

Around 36,000 records. I would be happy to supply the database if that would help.

For now, this works almost perfectly, except for the extra comma after the last record, but it’s specific to this database, so for anyone else using this, YMMV.

field «Check #»

local dataset,dataitem

arraybuild dataitem,¶,'',
str(«Check #»)+',"'+
pattern(«Check total»,"(#.,##)")+'"],'

let startCode = "{ ""data"": ["
let endCode = "]}"

filesave "warrants.json", startCode + replace(dataset,"<DATA>",dataitem) + endCode
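The trailing-comma problem goes away if the separator is placed between rows rather than appended to every row. In Python terms (illustrative only; the wrapper mimics the post’s { "data": [ ... ]} layout):

```python
import json

# Join rows with ",\n" instead of appending "," to each row, so no
# trailing comma is left before the closing bracket.
rows = [["40", "36,948.22"], ["41", "1,200.00"]]
body = ",\n".join(json.dumps(r) for r in rows)
doc = '{ "data": [\n' + body + '\n]}'
json.loads(doc)   # parses cleanly: no trailing comma to strip
print(doc)
```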