How to get all fields when MySQL parses a JSON array
Introduction
During development I ran into a string column holding a JSON array and needed to parse the array to get all the information it contains. The data has the following format:
[{"itemId":3101,"itemName":"空滤器及进气管道"},{"itemId":3102,"itemName":"水管、水泵"},{"itemId":3103,"itemName":"柴油管道"},{"itemId":3104,"itemName":"高压泵、机油泵"}]
Looking at the JSON array, it is a list containing multiple JSON objects. What we need to do is split every JSON object out of the list and then parse each one.
Analysis shows that a JSON object that stands on its own can be read with the JSON_EXTRACT function. Since the list contains several JSON objects, we first need to split the list into separate JSON strings.
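For reference, here is a minimal sketch of the source table used by the queries below. The table name maptest and the column name jsonarr come from those queries; the column type and the single sample row are assumptions for illustration only:
-- hypothetical setup: maptest/jsonarr are the names used later, the TEXT type and sample row are assumed
CREATE TABLE maptest (jsonarr TEXT);
INSERT INTO maptest (jsonarr) VALUES
  ('[{"itemId":3101,"itemName":"空滤器及进气管道"},{"itemId":3102,"itemName":"水管、水泵"},{"itemId":3103,"itemName":"柴油管道"},{"itemId":3104,"itemName":"高压泵、机油泵"}]');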
Before reading on, you should be familiar with two MySQL functions:
SUBSTRING_INDEX
JSON_EXTRACT
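As a quick refresher, a small sketch of what the two functions do (the literal values are examples only):
-- SUBSTRING_INDEX(str, delim, n): everything before the n-th delimiter,
-- or after it (counting from the right) when n is negative
SELECT SUBSTRING_INDEX('a;b;c', ';', 2);   -- 'a;b'
SELECT SUBSTRING_INDEX('a;b;c', ';', -1);  -- 'c'
-- JSON_EXTRACT(json, path): the value at the given JSON path
SELECT JSON_EXTRACT('{"itemId":3101,"itemName":"柴油管道"}', '$.itemId');  -- 3101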
Step 1: Split one row into multiple rows
Splitting one row into multiple rows means turning the list into one row per JSON object. To do this we need the following.
1.1 Create a new table keyid that simply stores consecutive integers starting from 0, for example:
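A minimal sketch of such a table; keyid and id are the names used by the queries below, and the number of rows is up to you:
CREATE TABLE keyid (id INT PRIMARY KEY);
INSERT INTO keyid (id) VALUES (0), (1), (2), (3), (4), (5), (6), (7), (8), (9);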
Other tutorials use the help_topic_id column of the mysql.help_topic table instead. That system table, however, requires root privileges before it can be read, so building your own number table is the most reliable option.
Note: the table must contain at least as many ids as there are JSON objects in the list. For example, with the 4 JSON objects above you need ids 0 through 3 (the ids start at 0). The maximum value of help_topic_id is around 700, so if your list contains more elements than that, help_topic_id is not suitable.
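If you need far more ids than you want to type by hand, one possible approach (a sketch, not part of the original article) is to cross-join the digits already in keyid with themselves:
-- assuming keyid already holds 0..9, this adds 10..99; repeat the idea for larger ranges
INSERT INTO keyid (id)
SELECT tens.id * 10 + ones.id
FROM keyid tens
JOIN keyid ones ON ones.id BETWEEN 0 AND 9
WHERE tens.id BETWEEN 1 AND 9;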
1.2 Find a split delimiter
A split delimiter is a character on which the string can be broken into multiple rows in one pass. The list above has no such character yet, so it has to be prepared first: we can treat ; as the delimiter. After processing, the content looks like this:
{"itemId":3101,"itemName":"空滤器及进气管道"};{"itemId":3102,"itemName":"水管、水泵"};{"itemId":3103,"itemName":"柴油管道"};{"itemId":3104,"itemName":"高压泵、机油泵"}
That is, strip the enclosing [ and ] of the list and change every },{ into };{ so that ; becomes the delimiter between elements:
select replace(replace(replace(jsonarr,"},{","};{"),"]",""),"[","") as jsonarr from maptest
1.3 Split into multiple rows with JOIN ... ON
Now we join the maptest table with the newly created keyid table: the ON condition matches each source row against as many keyid rows as the list has elements, and SUBSTRING_INDEX then extracts the element that corresponds to each id.
The code is as follows:
select a.jsonarr,
       SUBSTRING_INDEX(SUBSTRING_INDEX(a.jsonarr, ";", b.id + 1), ";", -1) AS jsonarr_info,
       b.id
from (select replace(replace(replace(jsonarr, "},{", "};{"), "]", ""), "[", "") as jsonarr from maptest) a
join keyid b
  on b.id < (length(a.jsonarr) - length(replace(a.jsonarr, ";", "")) + 1);
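A short sketch of how the query works on the sample row (the literal string below is only an illustration): length(a.jsonarr) - length(replace(a.jsonarr, ";", "")) counts the ";" delimiters, which is 3 here, so the ON condition keeps keyid rows 0 through 3, and the nested SUBSTRING_INDEX isolates the (id + 1)-th segment for each of those rows:
SELECT SUBSTRING_INDEX(SUBSTRING_INDEX('a;b;c;d', ';', 1 + 1), ';', -1);  -- 'b', i.e. the segment for id = 1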
At this point the JSON array has been split into multiple rows.
Step 2: Parse the JSON string
After splitting into multiple rows, each piece can be parsed with JSON_EXTRACT.
The complete code is as follows:
select a.jsonarr,
       SUBSTRING_INDEX(SUBSTRING_INDEX(a.jsonarr, ";", b.id + 1), ";", -1) AS jsonarr_info,
       b.id,
       JSON_EXTRACT(SUBSTRING_INDEX(SUBSTRING_INDEX(a.jsonarr, ";", b.id + 1), ";", -1), '$[0].itemId') as itemId,
       replace(JSON_EXTRACT(SUBSTRING_INDEX(SUBSTRING_INDEX(a.jsonarr, ";", b.id + 1), ";", -1), '$[0].itemName'), '"', '') as itemName
from (select replace(replace(replace(jsonarr, "},{", "};{"), "]", ""), "[", "") as jsonarr from maptest) a
join keyid b
  on b.id < (length(a.jsonarr) - length(replace(a.jsonarr, ";", "")) + 1);
Of course, it is also possible to use the help_topic_id column of the mysql.help_topic table instead. The code is as follows:
select a.jsonarr,
       SUBSTRING_INDEX(SUBSTRING_INDEX(a.jsonarr, ";", b.help_topic_id + 1), ";", -1) AS jsonarr_info,
       b.help_topic_id,
       JSON_EXTRACT(SUBSTRING_INDEX(SUBSTRING_INDEX(a.jsonarr, ";", b.help_topic_id + 1), ";", -1), '$[0].itemId') as itemId,
       replace(JSON_EXTRACT(SUBSTRING_INDEX(SUBSTRING_INDEX(a.jsonarr, ";", b.help_topic_id + 1), ";", -1), '$[0].itemName'), '"', '') as itemName
from (select replace(replace(replace(jsonarr, "},{", "};{"), "]", ""), "[", "") as jsonarr from maptest) a
join mysql.help_topic b
  on b.help_topic_id < (length(a.jsonarr) - length(replace(a.jsonarr, ";", "")) + 1);
Note: if the value JSON_EXTRACT returns is a string, it keeps its surrounding double quotes. Stripping them with REPLACE, as in the queries above, is enough (JSON_UNQUOTE achieves the same result).
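A brief sketch of that behaviour (the literal JSON is only an example):
SELECT JSON_EXTRACT('{"itemId":3101,"itemName":"柴油管道"}', '$.itemName');                   -- "柴油管道" (still quoted)
SELECT REPLACE(JSON_EXTRACT('{"itemId":3101,"itemName":"柴油管道"}', '$.itemName'), '"', '');  -- 柴油管道
SELECT JSON_UNQUOTE(JSON_EXTRACT('{"itemId":3101,"itemName":"柴油管道"}', '$.itemName'));      -- 柴油管道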