I have an existing database design which I would like to normalize. Essentially, the table causing me problems is one with about 150 columns. The table holds live data for a natural gas well, so there are columns for the various temperatures, pressures, etc. The data types of these columns vary, which is what's causing me grief, but normalizing the table would allow additional data points to be added dynamically. My initial thought on normalizing was to create a table with (among other columns) a single column to hold the values for the temperatures, pressures, etc. The problem, of course, is the variety of data types. I would have to store them all in a varchar or something and cast them on the way out with dynamic SQL. Given the possible performance issues related to casting everything & the drawbacks inherent in using dynamic SQL, I was hoping to find a better solution. Has anyone come across a similar situation and come up with a better approach?
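To illustrate, here's roughly the layout I had in mind, sketched in Python against SQLite just to keep it self-contained (the table and column names are placeholders, not our actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical one-value-column layout: one row per (well, data point),
# with every value stored as text regardless of its real type.
conn.execute("""
    CREATE TABLE well_reading (
        well_id    INTEGER NOT NULL,
        point_name TEXT    NOT NULL,   -- e.g. 'tubing_pressure'
        data_type  TEXT    NOT NULL,   -- the type to cast back to
        value      TEXT    NOT NULL    -- everything crammed into varchar
    )
""")

conn.executemany(
    "INSERT INTO well_reading VALUES (?, ?, ?, ?)",
    [
        (1, "tubing_pressure", "REAL", "861.5"),
        (1, "casing_pressure", "REAL", "912.0"),
        (1, "status_code", "INTEGER", "3"),
    ],
)

# Reading the data back means casting on the way out -- the part I'm
# worried about for both performance and maintainability, since in the
# real system the target type would have to be spliced into the SQL
# dynamically based on the data_type column.
row = conn.execute(
    "SELECT CAST(value AS REAL) FROM well_reading "
    "WHERE well_id = 1 AND point_name = 'tubing_pressure'"
).fetchone()
print(row[0])  # 861.5
```

New data points then become new rows instead of new columns, which is the appeal; the cost is that every read goes through a cast.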
Essentially, our existing table holding live data for each well has roughly 150 columns. The issue we encounter is that each customer's data-gathering needs are a bit different. A typical customer will only require 20 of those columns, but will need 5 more that we don't have. So our predicament is that we must add new columns for each customer we encounter, while each customer leaves the majority of the existing columns unused. We are trying to come up with a design that does not require modifying the database & supporting code for each new customer.
After some reading, it looks like this approach is called EAV (entity-attribute-value). I've done some reading on EAV & opinions seem mixed. Many people say to avoid it, but it seems that in some circumstances it is the only way to accomplish the goal I've stated above. I've also read that a 6NF database is the way to truly do EAV correctly, but that seems pretty complicated and information on it is scarce.