Archive for the ‘JSON’ Category
MySQL Query from JSON
One of my students asked how you could get JSON data out in tabular format. I said they should look at Øystein Grøvlen’s JSON_TABLE – Best of Both Worlds blog post from 2018. Unfortunately, the student wanted another example with the Video Store model that we use in class.
For clarity, all path expressions start with a $ followed by one or more selectors (see the short example after this list):
- A period followed by a name, such as $.website
- [N], where N is the position in a zero-indexed array
- The .* wildcard evaluates the values of all members of an object
- The [*] wildcard evaluates the values of all elements of an array
- The prefix and suffix wildcard, **, evaluates all paths that begin with the named prefix and end with the named suffix
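If it helps to see those selectors in action before the table example, here’s a small sketch you can paste into a MySQL client. The @doc document is a made-up literal used only for illustration:

-- A hypothetical JSON document, used only to illustrate the path selectors.
SET @doc = '{"website": "example.com", "contact": [{"last_name": "Winn"}, {"last_name": "Vizquel"}]}';

-- A period followed by a name.
SELECT JSON_EXTRACT(@doc, '$.website') AS website;

-- [N] selects one element of a zero-indexed array.
SELECT JSON_EXTRACT(@doc, '$.contact[0].last_name') AS first_contact;

-- [*] selects all elements of an array and returns them as a JSON array.
SELECT JSON_EXTRACT(@doc, '$.contact[*].last_name') AS all_contacts;

The [*] form returns a single JSON array of every matching value, which is why the JSON_TABLE function shown below is the better tool when you want one row per array element.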
So, here’s a quick supplement to what’s already there. It assumes you created an example table based on my prior blog post that looks like this:
+----+-----------------------------------------------------------------------------------------------------------------+
| id | struct                                                                                                          |
+----+-----------------------------------------------------------------------------------------------------------------+
|  1 | {"contact": [{"last_name": "Winn", "first_name": "Randi"}, {"last_name": "Winn", "first_name": "Brian"}], "account_number": "US00001"} |
|  2 | {"contact": [{"last_name": "Vizquel", "first_name": "Oscar"}, {"last_name": "Vizquel", "first_name": "Doreen"}], "account_number": "US00002"} |
|  3 | {"contact": [{"last_name": "Sweeney", "first_name": "Meaghan"}, {"last_name": "Sweeney", "first_name": "Matthew"}, {"last_name": "Sweeney", "first_name": "Ian"}], "account_number": "US00003"} |
|  4 | {"contact": [{"last_name": "Clinton", "first_name": "Goeffrey"}], "account_number": "US00004"} |
|  5 | {"contact": [{"last_name": "Moss", "first_name": "Wendy"}], "account_number": "US00005"} |
|  6 | {"contact": [{"last_name": "Gretelz", "first_name": "Simon"}], "account_number": "US00006"} |
|  7 | {"contact": [{"last_name": "Royal", "first_name": "Elizabeth"}], "account_number": "US00007"} |
|  8 | {"contact": [{"last_name": "Smith", "first_name": "Brian"}], "account_number": "US00008"} |
|  9 | {"contact": [{"last_name": "Potter", "first_name": "Harry"}, {"last_name": "Potter", "first_name": "Ginny"}, {"last_name": "Potter", "first_name": "Lily"}], "account_number": "US00011"} |
+----+-----------------------------------------------------------------------------------------------------------------+
9 rows in set (0.01 sec)
You can query the account_number key value like this:
SELECT   id
,        JSON_EXTRACT(struct, "$.account_number") AS account_no
FROM     example;
It returns:
+----+------------+
| id | account_no |
+----+------------+
|  1 | "US00001"  |
|  2 | "US00002"  |
|  3 | "US00003"  |
|  4 | "US00004"  |
|  5 | "US00005"  |
|  6 | "US00006"  |
|  7 | "US00007"  |
|  8 | "US00008"  |
|  9 | "US00011"  |
+----+------------+
9 rows in set (0.00 sec)
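As a side note, if your server is MySQL 5.7.13 or later, you can write the same extraction with the -> shorthand for JSON_EXTRACT, and with ->> when you also want the double quotes stripped. Here’s a quick sketch against the same example table:

-- -> is shorthand for JSON_EXTRACT; ->> also removes the surrounding double quotes.
SELECT   id
,        struct->"$.account_number"  AS quoted_account_no
,        struct->>"$.account_number" AS unquoted_account_no
FROM     example;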
You use the JSON_TABLE function to get the embedded array elements of first and last name, like:
SELECT   id
,        contact.*
FROM     example CROSS JOIN JSON_TABLE(
           struct
          ,"$.contact[*]"
           COLUMNS(
             lname JSON PATH "$.last_name"
           , fname JSON PATH "$.first_name")) AS contact;
It returns:
+----+-----------+-------------+
| id | lname     | fname       |
+----+-----------+-------------+
|  1 | "Winn"    | "Randi"     |
|  1 | "Winn"    | "Brian"     |
|  2 | "Vizquel" | "Oscar"     |
|  2 | "Vizquel" | "Doreen"    |
|  3 | "Sweeney" | "Meaghan"   |
|  3 | "Sweeney" | "Matthew"   |
|  3 | "Sweeney" | "Ian"       |
|  4 | "Clinton" | "Goeffrey"  |
|  5 | "Moss"    | "Wendy"     |
|  6 | "Gretelz" | "Simon"     |
|  7 | "Royal"   | "Elizabeth" |
|  8 | "Smith"   | "Brian"     |
|  9 | "Potter"  | "Harry"     |
|  9 | "Potter"  | "Ginny"     |
|  9 | "Potter"  | "Lily"      |
+----+-----------+-------------+
15 rows in set (0.00 sec)
You can combine both approaches, as shown below.
SELECT   id
,        JSON_EXTRACT(struct, "$.account_number") AS account_no
,        contact.*
FROM     example CROSS JOIN JSON_TABLE(
           struct
          ,"$.contact[*]"
           COLUMNS(
             lname JSON PATH "$.last_name"
           , fname JSON PATH "$.first_name")) AS contact;
It returns:
+----+------------+-----------+-------------+
| id | account_no | lname     | fname       |
+----+------------+-----------+-------------+
|  1 | "US00001"  | "Winn"    | "Randi"     |
|  1 | "US00001"  | "Winn"    | "Brian"     |
|  2 | "US00002"  | "Vizquel" | "Oscar"     |
|  2 | "US00002"  | "Vizquel" | "Doreen"    |
|  3 | "US00003"  | "Sweeney" | "Meaghan"   |
|  3 | "US00003"  | "Sweeney" | "Matthew"   |
|  3 | "US00003"  | "Sweeney" | "Ian"       |
|  4 | "US00004"  | "Clinton" | "Goeffrey"  |
|  5 | "US00005"  | "Moss"    | "Wendy"     |
|  6 | "US00006"  | "Gretelz" | "Simon"     |
|  7 | "US00007"  | "Royal"   | "Elizabeth" |
|  8 | "US00008"  | "Smith"   | "Brian"     |
|  9 | "US00011"  | "Potter"  | "Harry"     |
|  9 | "US00011"  | "Potter"  | "Ginny"     |
|  9 | "US00011"  | "Potter"  | "Lily"      |
+----+------------+-----------+-------------+
15 rows in set (0.00 sec)
Lastly, if you want to get rid of the enclosing double quotes you can do the following:
WITH raw AS
 (SELECT   id
  ,        JSON_EXTRACT(struct, "$.account_number") AS account_no
  ,        contact.*
  FROM     example CROSS JOIN JSON_TABLE(
             struct
            ,"$.contact[*]"
             COLUMNS(
               lname JSON PATH "$.last_name"
             , fname JSON PATH "$.first_name")) AS contact)
SELECT   id
,        REGEXP_REPLACE(account_no,'"','') AS account_no
,        REGEXP_REPLACE(lname,'"','') AS lname
,        REGEXP_REPLACE(fname,'"','') AS fname
FROM     raw;
It’s also possible to use the JSON_UNQUOTE function to clean up the double quotes, as sketched below. I hope this helps those extracting JSON data into tabular result sets.
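Here’s a minimal sketch of that approach against the same example table; it simply swaps each REGEXP_REPLACE call for JSON_UNQUOTE:

-- A sketch of the same result set using JSON_UNQUOTE instead of REGEXP_REPLACE.
SELECT   id
,        JSON_UNQUOTE(JSON_EXTRACT(struct, "$.account_number")) AS account_no
,        JSON_UNQUOTE(contact.lname) AS lname
,        JSON_UNQUOTE(contact.fname) AS fname
FROM     example CROSS JOIN JSON_TABLE(
           struct
          ,"$.contact[*]"
           COLUMNS(
             lname JSON PATH "$.last_name"
           , fname JSON PATH "$.first_name")) AS contact;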
MySQL Backslashes
Yesterday, I wrote a blog post that showed you how to write a query returning a JSON structure for a 1:many relationship. The relationship was between the member and contact table. It returns one account_number from the member table and a list of first_name and last_name columns from the contact table in a JSON structure.
One of my students asked why I chose to strip the backslashes with Python, and my reply was that the SQL was already complex enough for most blog readers. The student then asked how you would do it in SQL. That’s a fair question for two reasons. First, you don’t need to do it in your local programs because it runs faster on the server. Second, if you strip the backslashes you can insert the result into a standard JSON column. This blog post shows you how to do both.
You would use three REGEXP_REPLACE function calls, like:
SELECT   REGEXP_REPLACE(
           REGEXP_REPLACE(
             REGEXP_REPLACE(
               JSON_OBJECT(
                  'account_number', account_number
                 ,'contact', CONCAT('['
                             , GROUP_CONCAT(
                                 JSON_OBJECT('first_name',first_name
                                            ,'last_name',last_name ) SEPARATOR ',')
                             ,']')
               )
             ,'\\\\','')
           ,'"\\\[','\\\[')
         ,'\\\]"','\\\]') AS json_result
FROM     member m INNER JOIN contact c
ON       m.member_id = c.member_id
GROUP BY m.account_number;
It returns the following:
+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| {"contact": [{"last_name": "Winn", "first_name": "Randi"},{"last_name": "Winn", "first_name": "Brian"}], "account_number": "US00001"} |
| {"contact": [{"last_name": "Vizquel", "first_name": "Oscar"},{"last_name": "Vizquel", "first_name": "Doreen"}], "account_number": "US00002"} |
| {"contact": [{"last_name": "Sweeney", "first_name": "Meaghan"},{"last_name": "Sweeney", "first_name": "Matthew"},{"last_name": "Sweeney", "first_name": "Ian"}], "account_number": "US00003"} |
| {"contact": [{"last_name": "Clinton", "first_name": "Goeffrey"}], "account_number": "US00004"} |
| {"contact": [{"last_name": "Moss", "first_name": "Wendy"}], "account_number": "US00005"} |
| {"contact": [{"last_name": "Gretelz", "first_name": "Simon"}], "account_number": "US00006"} |
| {"contact": [{"last_name": "Royal", "first_name": "Elizabeth"}], "account_number": "US00007"} |
| {"contact": [{"last_name": "Smith", "first_name": "Brian"}], "account_number": "US00008"} |
| {"contact": [{"last_name": "Potter", "first_name": "Harry"},{"last_name": "Potter", "first_name": "Ginny"},{"last_name": "Potter", "first_name": "Lily"}], "account_number": "US00011"} |
+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
9 rows in set (0.00 sec)
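As an aside, if your server is MySQL 5.7.22 or later, the JSON_ARRAYAGG function builds the contact list as a native JSON array, so the backslashes never appear and no REGEXP_REPLACE calls are needed. Here’s a minimal sketch against the same member and contact tables:

-- A sketch that avoids the escaped string entirely by aggregating native JSON.
SELECT   JSON_OBJECT(
            'account_number', m.account_number
           ,'contact', JSON_ARRAYAGG(
                         JSON_OBJECT('first_name', c.first_name
                                    ,'last_name', c.last_name))) AS json_result
FROM     member m INNER JOIN contact c
ON       m.member_id = c.member_id
GROUP BY m.account_number;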
Let’s create a table with a JSON structure with the following script:
/* Drop the table if it exists. */
DROP TABLE IF EXISTS example;

/* Create an example table. */
CREATE TABLE example
( id      int unsigned auto_increment
, struct  json
, PRIMARY KEY (id));
Now, we can embed the query inside an INSERT statement:
INSERT INTO example
( struct )
(SELECT   REGEXP_REPLACE(
            REGEXP_REPLACE(
              REGEXP_REPLACE(
                JSON_OBJECT(
                   'account_number', account_number
                  ,'contact', CONCAT('['
                              , GROUP_CONCAT(
                                  JSON_OBJECT('first_name',first_name
                                             ,'last_name',last_name ) SEPARATOR ',')
                              ,']')
                )
              ,'\\\\','')
            ,'"\\\[','\\\[')
          ,'\\\]"','\\\]') AS json_result
 FROM     member m INNER JOIN contact c
 ON       m.member_id = c.member_id
 GROUP BY m.account_number);
A query of the example table, like:
SELECT * FROM example;
Returns:
+----+-----------------------------------------------------------------------------------------------------------------+
| id | struct                                                                                                          |
+----+-----------------------------------------------------------------------------------------------------------------+
|  1 | {"contact": [{"last_name": "Winn", "first_name": "Randi"}, {"last_name": "Winn", "first_name": "Brian"}], "account_number": "US00001"} |
|  2 | {"contact": [{"last_name": "Vizquel", "first_name": "Oscar"}, {"last_name": "Vizquel", "first_name": "Doreen"}], "account_number": "US00002"} |
|  3 | {"contact": [{"last_name": "Sweeney", "first_name": "Meaghan"}, {"last_name": "Sweeney", "first_name": "Matthew"}, {"last_name": "Sweeney", "first_name": "Ian"}], "account_number": "US00003"} |
|  4 | {"contact": [{"last_name": "Clinton", "first_name": "Goeffrey"}], "account_number": "US00004"} |
|  5 | {"contact": [{"last_name": "Moss", "first_name": "Wendy"}], "account_number": "US00005"} |
|  6 | {"contact": [{"last_name": "Gretelz", "first_name": "Simon"}], "account_number": "US00006"} |
|  7 | {"contact": [{"last_name": "Royal", "first_name": "Elizabeth"}], "account_number": "US00007"} |
|  8 | {"contact": [{"last_name": "Smith", "first_name": "Brian"}], "account_number": "US00008"} |
|  9 | {"contact": [{"last_name": "Potter", "first_name": "Harry"}, {"last_name": "Potter", "first_name": "Ginny"}, {"last_name": "Potter", "first_name": "Lily"}], "account_number": "US00011"} |
+----+-----------------------------------------------------------------------------------------------------------------+
9 rows in set (0.00 sec)
I hope this answers the question about whether you can use SQL to remove the backslashes from the original result set, and how you can insert the JSON result set into a JSON data type.
MySQL JSON Tricks
Are they really tricks, or simply basic techniques combined to create a solution? Before writing up these mechanics for using native MySQL to create a compound JSON object, let me point out that the easiest way to get one is to use the MySQL Node.js library, as shown recently in my “Is SQL Programming” blog post.
Moving data from a relational model to a JSON structure isn’t as simple as a delimited list of columns in a SQL query. Let’s look at it in stages, based on the MySQL Server manual’s 12.18.2 Functions That Create JSON Values.
Here’s how you return a single row as a JSON object, which is quite straightforward:
SELECT   JSON_OBJECT('first_name', c.first_name, 'last_name', c.last_name) AS json_result
FROM     contact c
WHERE    first_name = 'Harry'
AND      last_name = 'Potter';
It returns:
+------------------------------------------------+
| json_result                                    |
+------------------------------------------------+
| {"last_name": "Potter", "first_name": "Harry"} |
+------------------------------------------------+
1 row in set (0.00 sec)
With a GROUP_CONCAT function, let’s capture a JSON array of all three Potter family members:
SELECT   CONCAT('['
         , GROUP_CONCAT(
             JSON_OBJECT('first_name',first_name
                        ,'last_name',last_name ) SEPARATOR ',')
         ,']') AS json_result
FROM     contact c
WHERE    c.last_name = 'Potter';
It returns an array of JSON objects:
+-----------------------------------------------------------------------------------------------------------------------------------------------+
| [{"last_name": "Potter", "first_name": "Harry"},{"last_name": "Potter", "first_name": "Ginny"},{"last_name": "Potter", "first_name": "Lily"}]  |
+-----------------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.01 sec)
Next, let’s put a 1:many relationship between the member and contact tables into a JSON structure with a single account number and an array of contacts. It requires a second call to the JSON_OBJECT function and the addition of a GROUP BY clause in the query.
SELECT   JSON_OBJECT(
            'account_number', account_number
           ,'contact', CONCAT('['
                       , GROUP_CONCAT(
                           JSON_OBJECT('first_name',first_name
                                      ,'last_name',last_name ) SEPARATOR ',')
                       ,']')
         ) AS json_result
FROM     member m INNER JOIN contact c
ON       m.member_id = c.member_id
WHERE    c.last_name = 'Potter'
GROUP BY m.account_number;
It returns the following string with an annoying set of backslashes. It also inverts the column order, which appears unavoidable, but that shouldn’t matter because the order of name-value pairs in JSON is immaterial.
+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| {"contact": "[{\"last_name\": \"Potter\", \"first_name\": \"Harry\"},{\"last_name\": \"Potter\", \"first_name\": \"Ginny\"},{\"last_name\": \"Potter\", \"first_name\": \"Lily\"}]", "account_number": "US00011"} |
+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)
The following quick little Python code cleans up the JSON string by removing the backslashes and extraneous quotes around the array of contacts.
# Import the library.
import mysql.connector
from mysql.connector import errorcode

try:
  # Open connection.
  cnx = mysql.connector.connect(user='student', password='student',
                                host='127.0.0.1', database='studentdb')

  # Create cursor.
  cursor = cnx.cursor()

  # Set the query statement.
  query = ("SELECT JSON_OBJECT( "
           "'account_number', m.account_number "
           ",'contact', CONCAT('[' "
           "            , GROUP_CONCAT( "
           "                JSON_OBJECT('first_name', c.first_name "
           "                           ,'last_name', c.last_name ) SEPARATOR ',') "
           "            ,']')) AS json_result "
           "FROM contact c INNER JOIN member m "
           "ON c.member_id = m.member_id "
           "WHERE c.last_name = %s "
           "GROUP BY account_number")

  # Execute cursor.
  cursor.execute(query, ["Potter"])

  # Display the column returned by the query stripped of backslashes and
  # extraneous quotes.
  for (row) in cursor:
    for column in range(len(row)):
      print(row[column].replace("\\", "").replace("\"[", "[").replace("]\"", "]"))

  # Close cursor.
  cursor.close()

# ------------------------------------------------------------
# Handle exception and close connection.
except mysql.connector.Error as e:
  if e.errno == errorcode.ER_ACCESS_DENIED_ERROR:
    print("Something is wrong with your user name or password")
  elif e.errno == errorcode.ER_BAD_DB_ERROR:
    print("Database does not exist")
  else:
    print("Error code:", e.errno)         # error number
    print("SQLSTATE value:", e.sqlstate)  # SQLSTATE value
    print("Error message:", e.msg)        # error message

# Close the connection when the try block completes.
else:
  cnx.close()
It returns:
{"contact": [{"last_name": "Potter", "first_name": "Harry"},{"last_name": "Potter", "first_name": "Ginny"},{"last_name": "Potter", "first_name": "Lily"}], "account_number": "US00011"} |
I hope this helps exhibit some less well-known MySQL syntax. Check this post to see how to insert a result set without Python as an intermediary.
Fun JavaScript Q?
I got sent a fun JavaScript question from a student. It was fun for two reasons: they were clearly engaged and trying to figure things out on their own, and they were just beyond basic syntax issues.
The student wanted to know why this code worked:
// Populate an object with one property.
var notempty = {test:"me"}

// Test for and enumerate object properties.
for (var property in notempty) {
  if (!notempty.hasOwnProperty(property)) {
    print("No properties")
  } else {
    print("[" + property + "]")
  }
}
print("done")
and returned:
[test]
done
but the behavior changed when the initialized object didn’t have any properties. It failed to print the No properties message:
// Populate an object without properties.
var empty = {}

// Test for and enumerate object properties.
for (var property in empty) {
  if (!empty.hasOwnProperty(property)) {
    print("No properties")
  } else {
    print("[" + property + "]")
  }
}
print("done")
It printed:
done
The answer was easy: there were no properties to enumerate, so the loop never executed. However, that’s only part of the answer. The complete answer is that both the empty object and the default Object.prototype have no properties to enumerate. You should perform a different test before the loop, like:
 1  // Populate an object without properties.
 2  var empty = {}
 3
 4  // Test for and enumerate object properties.
 5  if (Object.keys(empty).length !== 0) {
 6    for (var property in empty) {
 7      if (empty.hasOwnProperty(property)) {
 8        print("[" + property + "]")
 9      }
10    }
11  } else {
12    print("No properties")
13  }
14
15  // Print complete message.
16  print("done")
If you run this from the mongo shell, like this
mongo --nodb --norc < object_properties.js
It prints:
MongoDB shell version v4.0.19
No properties
done
bye
If you run this inside the mongo shell with the load() function, like:

load("object_properties.js")
It prints:
No properties
done
true
However, you can generate a parsing error when you rewrite line 11 of the if-else statement so that the else keyword starts its own line, like so:
if (Object.keys(empty).length !== 0) {
  ...
}
else {
  ...
}
The reason for the error is that JavaScript appends a semicolon (;) at the end of each line. Therefore, the if-else statement appears as an if-only statement, which raises the syntax error.
It prints:
MongoDB shell version v4.0.19
2021-08-02T13:38:34.672-0600 E QUERY    [js] SyntaxError: expected expression, got keyword 'else' @(shell):1:0
done
bye
It appears the parsing error is a warning message because the code completes. Personally, I would prefer that they not raise a warning message because it actually implies a style preference.
There’s a next level problem you may run into while trying to implement reflection, or the discovery of existing properties and their value types, of JavaScript objects. Here’s that code, which seems to be missing from many tutorials and articles:
 1  // Create an object with functions.
 2  // Populate an object with properties.
 3  var notEmpty = { first_name: "Clark"
 4                 , last_name: "Kent"
 5                 , getName: function() { return this.first_name + " " + this.last_name } }
 6
 7  // Test for and enumerate object properties.
 8  if (Object.keys(notEmpty).length !== 0) {
 9    for (var property in notEmpty) {
10      if (notEmpty.hasOwnProperty(property)) {
11        print("[" + property + "] value is a [" +
12              typeof(notEmpty[property]) + "]")
13      }
14    }
15  } else {
16    print("No properties")
17  }
18
19  print("done")
This code on line 12 lets you view a property’s value type:
typeof(notEmpty[property])
Only the object[property] subscript notation works when the property name is held in a variable. JavaScript’s other syntax for accessing a property’s value, the dot notation object.property, only works with a literal property name. Actually, dot notation would work if you embed it inside an eval() function call, but the eval() function is bad juju (black magic) in JavaScript coding.
It prints the following:
MongoDB shell version v4.0.19
[first_name] value is a [string]
[last_name] value is a [string]
[getName] value is a [function]
done
bye
While this shows the basic functionality, it would be more effective to create a function and extend it while adding some output formatting. The next example wouldn’t be found in a JSON (JavaScript Object Notation) collection element but would be found in a NodeJS application managing JSON structures. You would define the constructor to match the data properties of the JSON structure.
The nice thing about a JavaScript function is that “It can specify a list of parameters that will act as variables initialized by the invocation arguments. The body of the function includes variable definitions and statements” (JavaScript: The Good Parts, 2008). Newer approaches allow you to pass an object as a parameter with named notation inside the object.
// Create an object with functions.
function Person(first_name, last_name) {
  this.first_name = first_name
  , this.last_name = last_name
  , this.setFirstName = function(first_name) { this.first_name = first_name }
  , this.getName = function() { return this.first_name + " " + this.last_name }
}

// Create an instance of a variable and print results.
var myPerson = new Person("Clark","Kent")
print("Initial Name: " + myPerson.getName())

// Call the added function and print the new value.
myPerson.setFirstName("Lois")
print("Reset Name: " + myPerson.getName())
print()

// Test for and enumerate object properties.
print("Variable Property Names & Values")
print("====================================")
if (Object.keys(myPerson).length !== 0) {
  for (var property in myPerson) {
    if (myPerson.hasOwnProperty(property)) {
      print("[" + property + "] value is a [" + typeof(myPerson[property]) + "]")
    }
  }
} else {
  print("No properties")
}
print("====================================")
print("done")
It prints the following:
MongoDB shell version v4.0.19
Initial Name: Clark Kent
Reset Name: Lois Kent

Variable Property Names & Values
====================================
[first_name] value is a [string]
[last_name] value is a [string]
[setFirstName] value is a [function]
[getName] value is a [function]
====================================
done
bye
While the prior example shows object construction, this one introduces one variable that inherits the Person function’s properties and another that inherits the same properties before adding a new function to that variable alone. The inherited properties, which are functions, become methods of the variable by delegation.
Below is the modified code. It changes the variable name from myPerson to myPerson1, and lines 22 through 28 add the new myPerson2 variable and the setLastName function. Lines 47 to 58 enumerate across the new second variable, myPerson2, to show you that it has the new setLastName function.
 1  // Create an object with functions.
 2  function Person(first_name, last_name) {
 3    this.first_name = first_name
 4    , this.last_name = last_name
 5    , this.setFirstName = function(first_name) {
 6        this.first_name = first_name
 7      }
 8    , this.getName = function() {
 9        return this.first_name + " " + this.last_name
10      }
11  }
12
13  // Create an instance of a variable and print results.
14  var myPerson1 = new Person("Clark","Kent")
15  print("Initial Name: " + myPerson1.getName())
16
17  // Call the added function and print the new value.
18  myPerson1.setFirstName("Lois")
19  print("Reset Name: " + myPerson1.getName())
20  print()
21
22  // Create an instance of a variable and print results.
23  var myPerson2 = new Person("Bruce","Wayne")
24  print("Initial Name: " + myPerson2.getName())
25
26  // Add a function to the local variable.
27  function setLastName(last_name) { this.last_name = last_name }
28  myPerson2.setLastName = setLastName
29  print()
30
31  // Test for and enumerate object properties.
32  print("Variable Property Names & Values")
33  print("====================================")
34  if (Object.keys(myPerson1).length !== 0) {
35    for (var property in myPerson1) {
36      if (myPerson1.hasOwnProperty(property)) {
37        print("[" + property + "] value is a [" + typeof(myPerson1[property]) + "]")
38      }
39    }
40  } else {
41    print("No properties")
42  }
43  print("====================================")
44  print()
45
46  // Test for and enumerate object properties.
47  print("Variable Property Names & Values")
48  print("====================================")
49  if (Object.keys(myPerson2).length !== 0) {
50    for (var property in myPerson2) {
51      if (myPerson2.hasOwnProperty(property)) {
52        print("[" + property + "] value is a [" + typeof(myPerson2[property]) + "]")
53      }
54    }
55  } else {
56    print("No properties")
57  }
58  print("====================================")
59  print("done")
The program shows you the following output:
MongoDB shell version v4.0.19
Initial Name: Clark Kent
Reset Name: Lois Kent

Initial Name: Bruce Wayne

Variable Property Names & Values
====================================
[first_name] value is a [string]
[last_name] value is a [string]
[setFirstName] value is a [function]
[getName] value is a [function]
====================================

Variable Property Names & Values
====================================
[first_name] value is a [string]
[last_name] value is a [string]
[setFirstName] value is a [function]
[getName] value is a [function]
[setLastName] value is a [function]
====================================
done
bye
As always, I hope this helps those working through a similar issue.
MongoDB Script Test
There are many ways to test and edit files. A lot of developers only use their favorite Integrated Development Environment (IDE), but I find testing script files within the scope of a pipelined set of scripts much faster.
The ability to edit a JavaScript file from within the mongo shell would be nice, but unfortunately it doesn’t exist. You are able to edit a complex variable, though, with a mechanism quite like the one in Oracle Database. Rather than leave you hanging on how to edit a complex variable in the mongo shell, here are the steps before launching into how to test your JavaScript files:
- You can enter this manually during any connection to the mongo shell or put it in your .mongorc.js configuration file, like this in Fedora with the fully qualified filename:

  EDITOR="/usr/bin/vim"

- Let's say you have a stooges array variable that contains "Moe", "Curly", and "Larry" and you want to add "Shemp" to the variable. You can edit the stooges array variable with vi and add "Shemp" to it by typing edit followed by the stooges variable name:

  edit stooges

- Then, you can test the stooges array variable's new values by typing its name:

  stooges

- It returns the following:

  [ "Moe", "Curly", "Shemp", "Larry" ]
Unfortunately, these changes to the stooges array variable will be lost after you break the connection. While it does afford a quick test case, you should make the changes in the external JavaScript file. Then, the change is there the next time you access the resource file.
Here’s my quick edit and test script technique for MongoDB from your present working directory:
- Assume you create a compliment.js test file, like:

  /* Declare an array variable and random index value. */
  var compliment = ["Bashful","Doc","Dopey","Grumpy","Happy","Sleepy","Sneezy"]
  var index = Math.floor(Math.random()*7)

  /* Print a welcome message. */
  print("Hello, " + compliment[index] + ".")
- You can edit the compliment.js file with vi and test the script interactively from the present working directory.
  - You can edit the file with the following syntax:

    vi compliment.js
  - Then, you can test the compliment.js file:

    mongo --nodb --norc < compliment.js
  - The --nodb option instructs MongoDB not to connect to a database, and the --norc option instructs MongoDB not to load your .mongorc.js file. Effectively, disabling the database connection and the loading of the MongoDB resource file (.mongorc.js) lets you test your code in MongoDB's JavaScript shell unencumbered by any overhead from the MongoDB server. The foregoing script returns the following:

    MongoDB shell version v4.0.19
    Hello, Sneezy.
    bye
- If you have the desired outcome, you're done. However, if you need further changes, you repeat the process.
As always, I hope this helps those looking for a quick way to accomplish a task.
PostgreSQL Write JSON File
In the process of working through issues to support ETL transformations from JSON with Python. The first step was extracting a the data from a PostgreSQL table into a JSON file. The syntax wasn’t exactly clear and the PostgreSQL Tutorial was a bit misleading but I worked it out. My initial sample deals with writing the data from the item
table of my videodb
database to a item.json
file in the /tmp
directory.
The query retrieves the data but casts the creation_date and last_update_date timestamp data types into date data types, as follows:
SELECT row_to_json(
         (SELECT ROW
          FROM (SELECT item_id
                ,      item_barcode
                ,      item_type
                ,      item_title
                ,      item_subtitle
                ,      item_rating
                ,      item_release_date
                ,      created_by
                ,      creation_date::DATE
                ,      last_updated_by
                ,      last_update_date::DATE) ROW)) AS DATA
FROM   item;
The internal SELECT-list creates the name element of the name:value pairs in the JSON structure. Naturally, it excludes the type-casting ::DATE portion of the query from the name elements. The row_to_json() function maps the name list to the list of values from each row and returns a result set.
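If it helps to see why the nested SELECT-list matters, a bare ROW constructor loses the column names, and PostgreSQL falls back to generic f1, f2, and so on. A minimal sketch with made-up values:

-- Without a named SELECT-list the keys become f1, f2, ... rather than column names.
SELECT row_to_json(ROW(1001, '9736-05640-4')) AS data;
-- Returns: {"f1":1001,"f2":"9736-05640-4"}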
The query returns the following 21 JSON objects:
{"item_id":1001,"item_barcode":"9736-05640-4","item_type":1016,"item_title":"The Hunt for Red October","item_subtitle":"Special Collector's Edition","item_rating":"PG","item_release_date":"1990-03-02","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1002,"item_barcode":"24543-02392","item_type":1016,"item_title":"Star Wars I","item_subtitle":"Phantom Menace","item_rating":"PG","item_release_date":"1999-05-04","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1003,"item_barcode":"24543-5615","item_type":1015,"item_title":"Star Wars II","item_subtitle":"Attack of the Clones","item_rating":"PG","item_release_date":"2002-05-16","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1004,"item_barcode":"24543-05539","item_type":1016,"item_title":"Star Wars II","item_subtitle":"Attack of the Clones","item_rating":"PG","item_release_date":"2002-05-16","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1005,"item_barcode":"24543-20309","item_type":1016,"item_title":"Star Wars III","item_subtitle":"Revenge of the Sith","item_rating":"PG13","item_release_date":"2005-05-19","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1006,"item_barcode":"86936-70380","item_type":1016,"item_title":"The Chronicles of Narnia","item_subtitle":"The Lion, the Witch and the Wardrobe","item_rating":"PG","item_release_date":"2002-05-16","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1007,"item_barcode":"91493-06475","item_type":1019,"item_title":"RoboCop","item_subtitle":"","item_rating":"Mature","item_release_date":"2003-07-24","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1008,"item_barcode":"93155-11810","item_type":1019,"item_title":"Pirates of the Caribbean","item_subtitle":"","item_rating":"Teen","item_release_date":"2003-06-30","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1009,"item_barcode":"12725-00173","item_type":1019,"item_title":"The Chronicles of Narnia","item_subtitle":"The Lion, the Witch and the Wardrobe","item_rating":"Everyone","item_release_date":"2003-06-30","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1010,"item_barcode":"45496-96128","item_type":1017,"item_title":"MarioKart","item_subtitle":"Double Dash","item_rating":"Everyone","item_release_date":"2003-11-17","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1011,"item_barcode":"08888-32214","item_type":1018,"item_title":"Splinter Cell","item_subtitle":"Chaos Theory","item_rating":"Teen","item_release_date":"2003-04-08","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1012,"item_barcode":"14633-14821","item_type":1018,"item_title":"Need for Speed","item_subtitle":"Most Wanted","item_rating":"Everyone","item_release_date":"2004-11-15","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1013,"item_barcode":"10425-29944","item_type":1019,"item_title":"The DaVinci 
Code","item_subtitle":"","item_rating":"Teen","item_release_date":"2006-05-19","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1014,"item_barcode":"52919-52057","item_type":1019,"item_title":"Cars","item_subtitle":"","item_rating":"Everyone","item_release_date":"2006-04-28","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1015,"item_barcode":"9689-80547-3","item_type":1020,"item_title":"Beau Geste","item_subtitle":"","item_rating":"PG","item_release_date":"1992-03-01","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1016,"item_barcode":"53939-64103","item_type":1020,"item_title":"I Remember Mama","item_subtitle":"","item_rating":"NR","item_release_date":"1998-01-05","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1017,"item_barcode":"24543-01292","item_type":1020,"item_title":"Tora! Tora! Tora!","item_subtitle":"The Attack on Pearl Harbor","item_rating":"G","item_release_date":"1999-11-02","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1018,"item_barcode":"43396-60047","item_type":1020,"item_title":"A Man for All Seasons","item_subtitle":"","item_rating":"G","item_release_date":"1994-06-28","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1019,"item_barcode":"43396-70603","item_type":1020,"item_title":"Hook","item_subtitle":"","item_rating":"PG","item_release_date":"1991-12-11","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1020,"item_barcode":"85391-13213","item_type":1020,"item_title":"Around the World in 80 Days","item_subtitle":"","item_rating":"G","item_release_date":"1992-12-04","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} {"item_id":1021,"item_barcode":"85391-10843","item_type":1020,"item_title":"Camelot","item_subtitle":"","item_rating":"G","item_release_date":"1998-05-15","created_by":1001,"creation_date":"2019-12-13","last_updated_by":1001,"last_update_date":"2019-12-13"} |
Moving on to the next step, you enclose the query in the COPY command syntax, like:
COPY (SELECT row_to_json(
               (SELECT ROW
                FROM (SELECT item_id
                      ,      item_barcode
                      ,      item_type
                      ,      item_title
                      ,      item_subtitle
                      ,      item_rating
                      ,      item_release_date
                      ,      created_by
                      ,      creation_date::DATE
                      ,      last_updated_by
                      ,      last_update_date::DATE) ROW)) AS DATA
      FROM   item) TO '/tmp/sample.json';
When you run it, it will fail with the following error message:
ERROR:  must be superuser or a member of the pg_write_server_files role to COPY to a file
HINT:  Anyone can COPY to stdout or from stdin. psql's \copy command also works for anyone.
You can fix this problem by connecting as the postgres user and granting the privilege to the student user:
GRANT pg_write_server_files TO student;
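Alternatively, as the hint in the error message suggests, psql’s client-side \copy meta-command needs no server-side role grant because the file is written by the client rather than the server. Here’s a sketch that reuses the same query; note that a \copy meta-command must be written on a single line, and the file lands on the machine running psql:

\copy (SELECT row_to_json((SELECT ROW FROM (SELECT item_id, item_barcode, item_type, item_title, item_subtitle, item_rating, item_release_date, created_by, creation_date::DATE, last_updated_by, last_update_date::DATE) ROW)) AS data FROM item) TO '/tmp/sample.json'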
With the grant in place, the COPY statement runs successfully as the student user and creates the sample.json file in the /tmp directory. As always, I hope this helps those looking for a solution.
Convert JSON with PHP
Sometimes I get poorly thought out or just naive questions. They’re naive questions because the questioner didn’t read the documentation or doesn’t understand the semantics of a given programming language. The question this time said they tried to implement what they found on a web page, but their sample json_decode function example failed after they followed directions. Surprise, it didn’t fail because they followed directions. They overlooked part of the example because they didn’t understand how to read a nested array in PHP. The actual example sets up an array of JSON objects and then uses print_r to read the Array, but the student tried to read it in a foreach loop. Naturally, their sample program raised an error because the base object was an Array, not a String, and their target JSON object was nested inside the base Array.
I rewrote the example file to simply convert a JSON structure to an associative array, as follows:
<?php
  // Assign a JSON object to a variable.
  $someJSON = '{"name":"Joe","moniker":"Falchetto"}';

  // Convert the JSON to an associative array.
  $someArray = json_decode($someJSON, true);

  // Read the elements of the associative array.
  foreach ($someArray as $key => $value) {
    echo "[" . $key . "][" . $value . "]";
  }
?>
When you call the program, like this
php test.php
It displays
[name][Joe][moniker][Falchetto]
As always, I hope this helps those looking to display a JSON structure in PHP.
Ruby GEM Mongo
While trying to use the Ruby gem utility to install the MongoDB gem, I encountered an error on a new Fedora 27 instance. This is the error message:
Fetching: bson-4.3.0.gem (100%)
Building native extensions. This could take a while...
ERROR:  Error installing mongo:
        ERROR: Failed to build gem native extension.

    current directory: /usr/local/share/gems/gems/bson-4.3.0/ext/bson
/usr/bin/ruby -r ./siteconf20180517-49401-1htl7zc.rb extconf.rb
mkmf.rb can't find header files for ruby at /usr/share/include/ruby.h

extconf failed, exit code 1

Gem files will remain installed in /usr/local/share/gems/gems/bson-4.3.0 for inspection.
Results logged to /usr/local/lib64/gems/ruby/bson-4.3.0/gem_make.out

[student@localhost ~]$ ruby --version
ruby 2.4.3p205 (2017-12-14 revision 61247) [x86_64-linux]
There wasn’t much on the error, but I checked the Ruby installation and found the ruby-devel package wasn’t installed by default. That’s odd, since I chose to install the development components on the workstation.
Not a problem, I simply ran the yum utility as root through a sudoer user to install the ruby-devel package:
yum install -y ruby-devel
You should see a successful installation log like:
Last metadata expiration check: 2:44:39 ago on Thu 17 May 2018 11:01:10 AM MDT.
Dependencies resolved.
================================================================================================
 Package           Arch          Version                 Repository       Size
================================================================================================
Installing:
 ruby-devel        x86_64        2.4.3-87.fc27           updates         119 k

Transaction Summary
================================================================================================
Install  1 Package

Total download size: 119 k
Installed size: 283 k
Downloading Packages:
ruby-devel-2.4.3-87.fc27.x86_64.rpm                             269 kB/s | 119 kB     00:00
------------------------------------------------------------------------------------------------
Total                                                           118 kB/s | 119 kB     00:01
Running transaction check
Transaction check succeeded.
Running transaction test
Transaction test succeeded.
Running transaction
  Preparing        :                                                                        1/1
  Installing       : ruby-devel-2.4.3-87.fc27.x86_64                                        1/1
  Verifying        : ruby-devel-2.4.3-87.fc27.x86_64                                        1/1

Installed:
  ruby-devel.x86_64 2.4.3-87.fc27

Complete!
As I suspected, it fixed the problem immediately when I ran the gem utility to install the mongo gem. The syntax to install the mongo gem is:
gem install mongo
The console output should be:
Building native extensions. This could take a while...
Successfully installed bson-4.3.0
Fetching: mongo-2.5.3.gem (100%)
Successfully installed mongo-2.5.3
Parsing documentation for bson-4.3.0
Installing ri documentation for bson-4.3.0
Parsing documentation for mongo-2.5.3
Installing ri documentation for mongo-2.5.3
Done installing documentation for bson, mongo after 3 seconds
2 gems installed
You can now write and run a Ruby program to test a MongoDB connection. Here’s a basic MongoDBTest.rb Ruby file:
#!/usr/bin/ruby

# Require libraries.
require 'rubygems'
require 'mongo'

# Create a new connection to MongoDB.
$client = Mongo::Client.new(['localhost'])

puts 'Connected to MongoDB'
puts '===================='
puts 'Database Names:'
puts '---------------'
$client.database_names.each{|name| puts name}
After you create the MongoDBTest.rb file, you need to change its permissions with the chmod utility as follows:
chmod 755 MongoDBTest.rb
Then, you can run it as follows from the directory where you created the file:
./MongoDBTest.rb
Unless you’ve added something to the base MongoDB instance, it should print:
Connected to MongoDB
====================
Database Names:
---------------
admin
local
As always, I hope this helps somebody looking for a straightforward example and solution.
MySQL JSON Functions
What the MySQL team is doing with JSON (JavaScript Object Notation) in MySQL 5.7 is great! The MySQL Server Blog (Rick Hillegas and Dag Wanvik) published two key articles about the new JSON functions. If you don’t follow that blog, the two articles are worth reading as a set.
Most folks know how important JSON is to web development. I like the following visual that highlights it. It was provided as a comment to this earlier Popular Programming Language post by Michael Farmer. Clearly, JavaScript is popular because it’s critical to effective web development. If you’re new to JSON, check out Adam Khoury’s JSON tutorial set on YouTube.
If you want the original graphic, you can find it here. It’s always hard to keep up with the technology, isn’t it? 🙂
Hobbyist at Heart
Fifteen years ago, when I dove into JavaScript, friends called me a hobbyist and said I was wasting my time. They had some basis for what they said because most of my work is server side, with the database or operating system. However, they stopped saying I wasted my time with the advent of NoSQL solutions, like Cassandra and MongoDB, and frameworks like Node.js. Now writing JavaScript is mainstream and an invaluable skill set, and they’re trying to catch up. While David Flanagan’s JavaScript: The Definitive Guide is comprehensive, I recommend my students start with David McFarland’s JavaScript & jQuery: The Missing Manual. I also strongly suggest they use jQuery first and write plain JavaScript when jQuery isn’t suitable.
I really liked David McFarland’s CSS: The Missing Manual when it came out in 2009, and I’ve recommended it for several years. However, he’s got a new version, CSS3: The Missing Manual, that came out in late January 2014. It’s an improvement over his first volume, and I’d recommend you upgrade if you’re writing, modifying, or maintaining Cascading Style Sheets, or if you just want to learn more about CSS.
Fortunately for me, CSS3: The Missing Manual is available through iTunes for Apple users. Naturally, it’s also available in Safari and Kindle formats. As an Apple user, I opted for the iBook format for my iPad Air. Unfortunately, it’s $27.99 as an iBook compared to $15.49 on Kindle, and that almost makes me opt to use the Kindle app. 😉