database - Advice on a data capture and management system


I am scoping out work on a fairly critical data capture and management site and am wondering how best to approach it. I will be using a LAMP architecture.

The backend will include the following:

User accounts will be created from an uploaded CSV file - I have done this before, so I am fairly confident about this part.
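As a rough sketch of that CSV-based account creation step (the column names here are my assumptions, not from the post), the upload can be parsed and validated before any accounts are inserted:

```python
import csv
import io

# Hypothetical CSV layout: one user per row (username, email, role).
# These column names are assumptions for illustration only.
sample = io.StringIO(
    "username,email,role\n"
    "alice,alice@example.com,end_user\n"
    "bob,bob@example.com,management\n"
)

def load_users(fh):
    """Parse the uploaded CSV, skipping rows missing required fields,
    and return a list of user dicts ready for account creation."""
    users = []
    for row in csv.DictReader(fh):
        if row.get("username") and row.get("email"):
            users.append(row)
    return users

users = load_users(sample)
print([u["username"] for u in users])  # ['alice', 'bob']
```

Validating before inserting means one bad row rejects cleanly instead of leaving a half-imported user list.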

Once registered, users will submit confidential information on a regular basis. Even though the data is confidential, I am planning to store it in a plain MySQL database, which will be backed up regularly.

There will be different roles:

  • An administrator who monitors user activity and creates new roles and user accounts where necessary.

  • End users - who will submit information through the form only, with the ability to edit their profiles, view their past submissions, and do other basic tasks.

  • Management users - who can run a series of pre-defined reports on the data and view the results in the browser, as well as extract selected data as spreadsheets. They can also run "free-form queries" against the data. A number of the standard reports will also be exposed as web services / feeds. The free-form query part worries me a bit, because I still do not know exactly what the data will be, and there is a strong possibility that new questions will be added dynamically to the data capture form over time. The database structure has to be flexible enough to accommodate that, and I also have to provide this free-form query capability, which I have never done before. Can anyone suggest a sensible approach?

There will also be versioning, so that when a user updates or modifies some data, the changes are tracked and the previous record(s) remain available. I was planning to build this into the database design: instead of updating and overwriting rows, a new record is always inserted and the old one is marked as "overwritten", i.e. archived. That way I should always be able to retrieve the live row as well as the archived rows (ordered by date). Does this make sense?

Thanks in advance for any hints - this is a bit more complicated than anything I have worked on before (mostly standard CMS work). I think I know how to handle it, but I would be grateful for any advice.

If you are going to use that versioning system, I suggest that you also create views that select only the active records, and have the user interface always use those views (except where it is displaying the change history) rather than the tables directly. I would also put a trigger on the table to ensure that only one record per entity is marked as active. Then you have the issue of a unique key: the key that identifies the record in PK/FK relationships cannot change every time you add another version.
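The view-plus-trigger suggestion can be sketched as follows (names are mine; sqlite3 syntax stands in for the poster's MySQL, where trigger syntax differs slightly):

```python
import sqlite3

# A view exposes only active rows, and a trigger archives the previous
# active row whenever a new version is inserted, so at most one row
# per item stays active.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE item_version (
    version_id INTEGER PRIMARY KEY,
    item_id    INTEGER NOT NULL,
    name       TEXT,
    is_active  INTEGER NOT NULL DEFAULT 1
);

-- The UI reads from this view, never from the table directly.
CREATE VIEW item AS
    SELECT item_id, name FROM item_version WHERE is_active = 1;

-- Deactivate any earlier active version of the same item.
CREATE TRIGGER one_active_per_item
AFTER INSERT ON item_version
BEGIN
    UPDATE item_version
    SET is_active = 0
    WHERE item_id = NEW.item_id
      AND version_id <> NEW.version_id
      AND is_active = 1;
END;
""")

con.execute("INSERT INTO item_version (item_id, name) VALUES (1, 'first')")
con.execute("INSERT INTO item_version (item_id, name) VALUES (1, 'second')")

current = con.execute("SELECT name FROM item WHERE item_id = 1").fetchall()
print(current)  # [('second',)]
```

Because `item_id` never changes between versions, it can safely serve as the stable key in PK/FK relationships while `version_id` identifies each revision.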

Alternatively, you can set up audit tables if you only need to research old changes occasionally; then you go back to the audit tables when you need the old data. If you have to look at history often, audit tables can be hard to query, because they usually store only the old and new values, the column name, and some metadata such as who made the change and when.

Probably the best option is a history table with the same columns as the main table plus a few extra ones: when the change was made and by whom, a new auto-incrementing surrogate key, and a unique index on the original PK plus the date. The active records all live in one table, the inactive records in the other (populated by a trigger), and you only need a UNION query when you want to see both sets of data together.
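That history-table variant might look like this (a sketch with assumed names, again using sqlite3 in place of MySQL): the trigger copies the old row into the history table on every update, and a UNION query shows both sets when needed.

```python
import sqlite3

# Active rows live in the main table; a trigger snapshots the old row
# into a parallel history table (same columns plus audit metadata).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT
);
CREATE TABLE customer_history (
    history_id  INTEGER PRIMARY KEY,   -- new surrogate key
    customer_id INTEGER NOT NULL,      -- the original PK
    name        TEXT,
    changed_at  TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TRIGGER customer_audit
AFTER UPDATE ON customer
BEGIN
    INSERT INTO customer_history (customer_id, name)
    VALUES (OLD.customer_id, OLD.name);
END;
""")

con.execute("INSERT INTO customer (customer_id, name) VALUES (1, 'Acme')")
con.execute("UPDATE customer SET name = 'Acme Ltd' WHERE customer_id = 1")

# UNION ALL shows live and historical rows together when required.
both = con.execute("""
    SELECT customer_id, name, 'live' AS source FROM customer
    UNION ALL
    SELECT customer_id, name, 'history' FROM customer_history
""").fetchall()
print(both)
```

Keeping the live table free of old versions means everyday queries stay fast, at the cost of the occasional UNION when history is needed.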

The key to all of this is to use triggers to populate the history, wherever you decide to store it. Databases are not touched only by user applications, so you have to make sure that every change made in the database ends up in your history, not just changes made through the user interface or your data layer. I can't speak for all databases, but in SQL Server a trigger fires once per batch of affected rows, so do not assume it only ever operates on a single row: a multi-row update in one query fires the trigger once for all of those rows. Writing triggers in a set-based fashion is therefore important; looping through records is slow, and your whole system can stall inside a trigger while it loops through a million records for an update that, say, increases prices by 10%.

