I'm in the process of developing a few large modules that have multiple related tables. It is a requirement for both that I can import and export all the data, i.e., export ALL the data from one instance of the module and import everything into another instance.
As I noted, much of this involves sets of data that will look like this:
TableA -> TableB <- TableC
TableA and TableC will have an identity key as their primary key. TableB will contain the integer keys to both A and C to form the dual relationship.
Therein lies the issue. I cannot use the identity keys, since there's no guarantee they will be the same during the import process. The same is true for lookup tables. I may have a table for Addresses and another for AddressTypes. The Address table will have an integer key pointing to the AddressType table that translates it into Work, Home, etc.
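To make that concrete, here is roughly the shape involved (a minimal sketch; the class and member names are hypothetical stand-ins, not my actual schema, and I'm using a nullable int where DNN convention would normally be Null.NullInteger, just to keep the sketch self-contained):

public class AddressType
{
    public int AddressTypeId { get; set; }  // identity key, assigned by the database
    public string Name { get; set; }        // "Home", "Work", etc.
}

public class Address
{
    public int AddressId { get; set; }          // identity key
    public int? AddressTypeId { get; set; }     // FK to AddressType; the value that breaks on import
    public string AddressTypeName { get; set; } // transient: the textual value carried in the export XML
    public string Street { get; set; }
}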
Exporting that isn't as troublesome as it sounds. The export for Address will contain the textual, resolved lookup value (Home, Work, etc.), so I'll have something like:
<AddressTypes>
  <AddressType>Home</AddressType>
  <AddressType>Work</AddressType>
</AddressTypes>
<Addresses>
  <Address>
    <AddressType>Home</AddressType>
    <Street>...</Street>
  </Address>
</Addresses>
Obviously this is a vastly simplified illustration, but it shows the basic problem that appears at the import stage. You have to import all the dependent tables first, such as address types. Those insertions will generate new integer keys in the lookup tables, and somehow the related business objects need to obtain those values to maintain referential integrity.
There are several ways to do this, but I'd like feedback on the one that fits most appropriately into the DNN Framework.
I could alter the controller so that every time I insert an address, if the address type key is null I call the address type lookup function by name and populate it (sketched below). That is relatively clean and foolproof, but it introduces additional logic into the insert path for all transactions, logic that may only be appropriate for imports. It also means potential additional database overhead on large imports; I could have 100,000 rows in some of these.
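Roughly what I mean, as a sketch only (AddressController, GetAddressTypeByName, and the data-provider hand-off are stand-ins for my module's own code):

public class AddressController
{
    public void AddAddress(Address address)
    {
        // Import support baked into the normal insert path: if the integer
        // key is missing but the textual value is present, resolve by name.
        // The drawback: this check (and a potential query) runs on EVERY insert.
        if (address.AddressTypeId == null && !string.IsNullOrEmpty(address.AddressTypeName))
        {
            address.AddressTypeId = GetAddressTypeByName(address.AddressTypeName).AddressTypeId;
        }

        // ...then hand off to the data provider as usual.
    }

    // Existing lookup; one query against the AddressTypes table.
    public AddressType GetAddressTypeByName(string name)
    {
        throw new NotImplementedException(); // stubbed for the sketch
    }
}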
I could add the logic above to the insert stored procedure.
I could do that lookup during the import process instead, again calling the business controller to look up the related values by name, as in the sketch below. That resolves the issue of having it in the insert path all the time, but it still has some overhead issues.
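Something like this in the import routine itself (ImportModule is the IPortable entry point; ParseAddresses is a hypothetical helper that deserializes the exported XML):

public void ImportModule(int moduleId, string content, string version, int userId)
{
    var controller = new AddressController();
    foreach (Address address in ParseAddresses(content)) // hypothetical XML helper
    {
        // One lookup query per row: 100,000 rows means 100,000 round trips
        // unless the controller caches the results somewhere.
        address.AddressTypeId = controller.GetAddressTypeByName(address.AddressTypeName).AddressTypeId;
        controller.AddAddress(address);
    }
}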
I could build a dictionary during the insertion to the lookup tables that indexes the new integer value by the textual key, and use those dictionaries to populate the integer values in the business objects (sketched below).
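In other words (again a sketch; ParseAddressTypes/ParseAddresses are hypothetical helpers, and AddAddressType is assumed to return the newly generated identity key):

// Inside the import routine, as above:
var controller = new AddressController();
var typeIdByName = new Dictionary<string, int>();

// First pass: insert the lookup rows and capture the generated keys.
// (A real version would also seed this map with any types that already
// exist in the target instance, so duplicates aren't inserted.)
foreach (string typeName in ParseAddressTypes(content))
{
    typeIdByName[typeName] = controller.AddAddressType(typeName);
}

// Second pass: resolve every textual key from memory; no per-row queries.
foreach (Address address in ParseAddresses(content))
{
    address.AddressTypeId = typeIdByName[address.AddressTypeName];
    controller.AddAddress(address);
}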
I could add code to the business object so that if the textual name is populated and the lookup key is still null, it does the lookup and populates itself, roughly as sketched below.
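Inside the business object that would look roughly like this (a backing field replaces the auto-property from the earlier sketch; the controller call is the same hypothetical lookup as above):

private int? _addressTypeId;

public int? AddressTypeId
{
    get
    {
        // Lazy resolution: if only the textual name was populated (as it
        // would be during an import), resolve and cache the integer key
        // the first time anything asks for it. The catch is that this
        // hides a potential database call behind a property getter.
        if (_addressTypeId == null && !string.IsNullOrEmpty(AddressTypeName))
        {
            _addressTypeId = new AddressController()
                .GetAddressTypeByName(AddressTypeName)
                .AddressTypeId;
        }
        return _addressTypeId;
    }
    set { _addressTypeId = value; }
}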
There could be other means, but in the opinion of those here, which is the most appropriate in the context of the framework?