
Updating Database From Static File JSON Feed

I have a PHP script that pulls a static JSON file which updates every 10 seconds. It contains details about events that happen, and new events are simply added to the top of the file. I then insert them into a MySQL database.

Because every pull returns every event, I only want to insert the new ones. The easy way would be to search for each event in the database (the primary keys are not the same as the source's), but at ~4000 events per day I don't want to run that many queries just to check whether a row already exists.

I am aware of INSERT IGNORE, but it looks like it only uses the PRIMARY KEY to detect duplicates.

What can I do (preferably easily) to prevent duplicates on two keys?

Example:

I have a table events with the following columns:

  • ID (irrelevant, really)
  • event_id (that I need to store from the source)
  • action_id (many action_ids belong to one event_id)
  • timestamp
  • whatever...

And on the first pull, my JSON data comes out like this:

event_id|action_id|...
   1    |   1
   1    |   2
   1    |   3
   2    |   1
   2    |   2
   2    |   3

Then the next pull is this:

event_id|action_id|...
   1    |   1
   1    |   2
   1    |   3
   1**  |   4**
   1**  |   5**
   2    |   1
   2    |   2
   2    |   3
   2**  |   4**

I only want the rows marked with asterisks to be inserted, and the others to be ignored. Remember, the primary key column id is completely irrelevant in this table; I just include it for uniqueness.

What command can I use to INSERT every event I pull, while adding only those that aren't duplicates across the two columns event_id and action_id?

Thanks.

Create a unique index on both columns:

CREATE UNIQUE INDEX event_action
    ON events (event_id, action_id);

With that index in place, INSERT IGNORE will skip any row whose (event_id, action_id) pair already exists — it respects any UNIQUE index, not just the primary key.
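A minimal sketch of the whole pattern, using Python's built-in sqlite3 as a stand-in for MySQL (SQLite spells MySQL's INSERT IGNORE as INSERT OR IGNORE; the table and index names mirror the question):

```python
import sqlite3

# In-memory SQLite database standing in for the MySQL table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        event_id INTEGER NOT NULL,
        action_id INTEGER NOT NULL
    )
""")
# The unique index from the answer: duplicate (event_id, action_id)
# pairs are rejected at insert time.
conn.execute("CREATE UNIQUE INDEX event_action ON events (event_id, action_id)")

first_pull = [(1, 1), (1, 2), (1, 3), (2, 1), (2, 2), (2, 3)]
second_pull = first_pull + [(1, 4), (1, 5), (2, 4)]  # three new rows

sql = "INSERT OR IGNORE INTO events (event_id, action_id) VALUES (?, ?)"
conn.executemany(sql, first_pull)
conn.executemany(sql, second_pull)  # re-inserts everything; dupes are ignored

count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(count)  # 9 — only the three new rows from the second pull were added
```

The same approach works unchanged in MySQL with INSERT IGNORE (or INSERT ... ON DUPLICATE KEY UPDATE, if you want to update existing rows instead of skipping them).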
