I'm planning on creating a system where many clients share the same database over the internet. There will be one core server hosting the database, and thousands of client programs that will modify it simultaneously over the internet. I've never built something like this before and I'm seeking advice because I'm not sure how this load will affect the system. The database has 10 tables, each with almost 60 fields, which start empty and are filled in by the clients as they access the database.
How can I create this kind of server program, and how can I make sure that the data sent by all the client programs is saved to the correct tables and fields? The clients will be installed in different places, and it is certain that several clients will send data to the same table and field at the same time. How can I build a server program that handles all the incoming data simultaneously, so the process is both fast and completely accurate? I also think it would be wise to encrypt the data on the client side before it is sent, for security purposes, but I worry that will slow things down because the server will be forced to decrypt everything before saving.
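To make my idea more concrete, here is a minimal sketch of the kind of server I'm imagining, using Python's standard library with SQLite (the table name `readings`, the `DB_PATH`, and the field names are just placeholders, not my real schema). Clients would POST JSON over HTTP, and the server writes each record with a parameterized INSERT inside a transaction, so simultaneous submissions are serialized by the database itself:

```python
import json
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

DB_PATH = "shared.db"  # placeholder path; the real DB would live on the core server


def init_db():
    # Create a tiny stand-in table; the real schema would have ~60 fields.
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS readings ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT,"
            "client_id TEXT NOT NULL,"
            "field_a TEXT)"
        )


def save_record(client_id, field_a):
    # The `with` block wraps the INSERT in a transaction, and the database
    # serializes concurrent writers, so two clients writing to the same
    # table at the same time cannot corrupt each other's data.
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            "INSERT INTO readings (client_id, field_a) VALUES (?, ?)",
            (client_id, field_a),  # parameterized: no SQL injection
        )


class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body a client sent and persist it.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        save_record(payload["client_id"], payload.get("field_a"))
        self.send_response(204)
        self.end_headers()


if __name__ == "__main__":
    init_db()
    HTTPServer(("0.0.0.0", 8000), Handler).serve_forever()
```

My understanding is that in a real deployment this handler would sit behind HTTPS/TLS, which encrypts the data in transit automatically, so I might not need to encrypt each record by hand on the client side at all. Please correct me if this design is wrong.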
Any resources, suggestions, or guides will be appreciated.