I have my resources stored in a MySQL DB:

eg :

| IPaddress | Status |
|-----------|--------|
| 10.x.x.x  | yes    |
| 10.1.x.x  | yes    |
I am trying to get an available resource, and then update the DB to set its status to 'no'.

```python
query = "SELECT IPaddress FROM TABLE WHERE Status = 'yes'"
cursor = mydb.cursor()
cursor.execute(query)
result = cursor.fetchall()
# result[0] is a tuple like ('10.x.x.x',), so take its first element,
# and pass it as a query parameter rather than formatting it into the string
query1 = "UPDATE TABLE SET Status = 'no' WHERE IPaddress = %s"
cursor.execute(query1, (result[0][0],))
mydb.commit()
```
But my API will get called from multiple places simultaneously. How can I ensure the same resource won't get allocated to different calls?


All 2 Replies

Did you post more than once? I can clean that up for you.

I'll skip making a tutorial here, but I recall a question like this during a SQL seminar. The person asking was new to SQL and couldn't get past this; they had written their own database system and were just moving over to SQL. Moving on: read the article at https://mysqlserverteam.com/innodb-data-locking-part-2-locks/

The short answer is:

> So, let me just say that this table shows locks taken by the Server, and that they indeed prevent other clients from trying to modify the table.

Read the entire article to learn more about why I never had to worry about this.

If you use InnoDB, it supports row level locking, which means that the individual row is locked during an INSERT command. If you use something like MyISAM, it uses table level locking, meaning the entire table is locked during an INSERT command of any single row, but it has the added benefit of the DELAYED keyword. When a client uses INSERT DELAYED, it gets an okay from the server at once, and the row is queued to be inserted when the table is not in use by any other thread.
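To make the row-level locking concrete for the original question: with InnoDB, a `SELECT ... FOR UPDATE` inside a transaction locks the selected row until `COMMIT`, so two concurrent API calls cannot both claim the same IP. Here's a minimal sketch, assuming an InnoDB table named `resources` with `IPaddress` and `Status` columns (the table name is a placeholder for your schema) and a mysql-connector-style connection object:

```python
# Sketch of a concurrency-safe allocator relying on InnoDB row locks.
# Assumes table `resources` (InnoDB) with columns IPaddress and Status.

def allocate_ip(mydb):
    """Atomically claim one available IP, or return None if none are free."""
    cursor = mydb.cursor()
    try:
        cursor.execute("START TRANSACTION")
        # FOR UPDATE locks the selected row until COMMIT/ROLLBACK, so a
        # second concurrent caller blocks instead of reading the same row.
        cursor.execute(
            "SELECT IPaddress FROM resources "
            "WHERE Status = 'yes' LIMIT 1 FOR UPDATE"
        )
        row = cursor.fetchone()
        if row is None:
            mydb.rollback()
            return None
        ip = row[0]
        # Parameterized update: no string formatting, no SQL injection.
        cursor.execute(
            "UPDATE resources SET Status = 'no' WHERE IPaddress = %s", (ip,)
        )
        mydb.commit()
        return ip
    except Exception:
        mydb.rollback()
        raise
```

Each API call then simply does `ip = allocate_ip(mydb)`; the lock is held only for the duration of the tiny select-then-update transaction.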

That being said, MySQL might not be the best fit if there are thousands of simultaneous inserts. I've found that MySQL works great for read-heavy databases, but not so much for write-heavy ones. At DaniWeb, we use Redis for API requests that require many simultaneous reads + writes, including handling our OAuth authentication and API tokens.

You might want to investigate if you're using the right tool for the job.
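For illustration, here's a hypothetical sketch of the Redis alternative: keep the free IPs in a Redis set and hand them out with `SPOP`, which removes and returns one member in a single atomic server-side operation, so two concurrent callers can never receive the same IP. The key name `free_ips` and the redis-py-style client are assumptions, not DaniWeb's actual setup:

```python
# Hypothetical allocator using a Redis set as the pool of free IPs.

def allocate_ip_redis(r, key="free_ips"):
    """Atomically pop one free IP from a Redis set; None if the set is empty."""
    ip = r.spop(key)            # atomic: remove and return a random member
    if isinstance(ip, bytes):   # redis-py returns bytes unless decode_responses=True
        ip = ip.decode()
    return ip

def release_ip_redis(r, ip, key="free_ips"):
    """Return an IP to the pool when the caller is done with it."""
    r.sadd(key, ip)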
