
Complex Import / Export implementation
2/4/2009 8:26 AM
 

I'm in the process of developing a few large modules that have multiple related tables. It is a requirement for both that I can import and export all the data, i.e. export ALL the data from one instance of the module and import everything into another instance.

As I noted, much of this involves sets of data that will look like this:

TableA -> TableB <- TableC

TableA and TableC will have an identity key as their primary key. TableB will contain the integer keys to both A and C to form the dual relationship.

Therein lies the issue. I cannot use the identity keys since there's no guarantee they will be the same during the import process. The same is true for lookup tables. I may have a table for Addresses, and another for AddressTypes. The Address table will have the integer key pointing to the AddressType table, which translates that into Work, Home, etc.

Exporting that isn't as troublesome as it sounds. The export for Address will contain the textual resolved lookup value (Home, Work, etc). So I'll have something like:

<AddressTypes>
  <AddressType>Home</AddressType>
  <AddressType>Work</AddressType>
</AddressTypes>
<Addresses>
  <Address>
    <AddressType>Home</AddressType>
    <Street>...</Street>
  </Address>
</Addresses>

Obviously this is a vastly simplified illustration, but it shows the basic problem that shows up at the import stage. You have to import all the dependent tables first, such as address types. Those insertions will generate new integer keys in the lookup tables, and somehow the related business objects need to obtain those values to maintain referential integrity.

There are several ways to do this, but I'd like feedback on the way that fits most appropriately into the DNN Framework.

I could alter the controller so that every time I insert an address, if the address type key is null, I call the lookup address type function by name and populate it. That is relatively clean and foolproof, but it introduces additional logic into the insert path for all transactions that may only be appropriate for imports. That also means potential additional database overhead on large imports; I could have 100,000 rows in some of these.
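
Something like this is what I have in mind (just a rough VB sketch; AddressInfo, AddressTypeInfo, AddressTypeController and the DataProvider method are illustrative names from my module, not real API, and Null.NullInteger is DNN's integer null sentinel):

' Rough sketch only - the Address* classes and DataProvider method are
' illustrative module names, not an existing API.
Public Sub AddAddress(ByVal address As AddressInfo)
    ' Import case: no key yet, so resolve (or create) the lookup row by name.
    If address.AddressTypeId = Null.NullInteger AndAlso address.AddressTypeName <> "" Then
        Dim typeController As New AddressTypeController()
        Dim addressType As AddressTypeInfo = typeController.GetAddressTypeByName(address.PortalId, address.AddressTypeName)
        If addressType Is Nothing Then
            address.AddressTypeId = typeController.AddAddressType(address.PortalId, address.AddressTypeName)
        Else
            address.AddressTypeId = addressType.AddressTypeId
        End If
    End If
    DataProvider.Instance().AddAddress(address)
End Sub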

I could add the logic above to the insert stored procedure.

I could do that lookup during the import process, again calling the business controller to look up the related values by name. That resolves the issue of having it in the insert path all the time, but still has some overhead issues.

I could create a dictionary during the insertion into the lookup tables that has the new integer value indexed by the textual key, and use those dictionaries to populate the integer values in the business objects (a rough sketch of this follows at the end of this post).

I could add code to the business object so that if the textual name is populated and the lookup index is still null, it does the lookup and populates it.

There could be other means, but in the opinion of those here, which is the most appropriate in the context of the framework?
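
To make the dictionary option concrete, here is roughly what I mean (AddressTypeController, AddressController and AddressInfo are again illustrative names from my module, and I'm assuming the export above is wrapped in a single root node so it can be loaded as an XmlDocument):

' Rough sketch - the controller and Info classes are illustrative module names;
' assumes Imports System.Xml and System.Collections.Generic.
Public Sub ImportAddresses(ByVal portalId As Integer, ByVal content As String)
    Dim doc As New XmlDocument()
    doc.LoadXml(content)

    ' Pass 1: insert the lookup rows and remember each new identity key by name.
    Dim addressTypeKeys As New Dictionary(Of String, Integer)
    Dim typeController As New AddressTypeController()
    For Each typeNode As XmlNode In doc.SelectNodes("//AddressTypes/AddressType")
        addressTypeKeys(typeNode.InnerText) = typeController.AddAddressType(portalId, typeNode.InnerText)
    Next

    ' Pass 2: insert the addresses, translating each textual value back to its new key.
    Dim addressController As New AddressController()
    For Each addressNode As XmlNode In doc.SelectNodes("//Addresses/Address")
        Dim address As New AddressInfo()
        address.AddressTypeId = addressTypeKeys(addressNode("AddressType").InnerText)
        address.Street = addressNode("Street").InnerText
        addressController.AddAddress(address)
    Next
End Sub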

 
2/4/2009 8:56 AM
 

I encountered the same problem with my ePrayer module, which had a number of lookup tables for category, etc. I went the route of first serializing data from each secondary table using the text value (such as category name) rather than the primary key. Likewise, when exporting data from the primary tables, the text value rather than the identity key for each row was serialized.

On import/deserialization of each secondary table, as records were added to the database, I built a Dictionary(Of String, Integer) keyed on the text value with its new identity key as its value. Then, when importing the primary tables, the dictionaries for each secondary table were consulted to obtain the corresponding identity key for insertion into the record as it was being added to the database. I felt that the dictionary approach would require the fewest database accesses.

An even bigger problem was how to handle fields such as Author, ApprovedBy, and LastUpdatedBy, all of which contained UserIds. I finally chose to export Usernames and then, during import, resolve those back to UserIds. If no corresponding Username was found in the Portal into which the module data was being imported, the UserId was set to -1 for the Anonymous user - not great, but acceptable in the particular use case.
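
The resolution itself is simple enough. Roughly (if I remember the DNN API correctly - check UserController.GetUserByName against the version you target; ResolveUserId is just an illustrative helper name):

' Rough sketch - assumes Imports DotNetNuke.Entities.Users.
Private Function ResolveUserId(ByVal portalId As Integer, ByVal username As String) As Integer
    Dim user As UserInfo = UserController.GetUserByName(portalId, username)
    If user Is Nothing Then
        Return -1   ' no matching account in the target portal - fall back to Anonymous
    End If
    Return user.UserID
End Function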

If you are interested, the source code of the ePrayer module is available on its project page on my site:

http://www.wesnetdesigns.com/Projects/ePrayerDNNModule/tabid/115/Default.aspx.

As this is a WSP project, the Install version also contains the source code. The implementation of IPortable is found in EPrayerController.vb.
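
If it helps before you dig into the source, the controller implements DNN's IPortable interface, which (from memory - verify against the DNN version you target) looks roughly like the skeleton below. The SerializeAll / DeserializeAndResolveKeys helpers are only placeholders for the real routines that write out the text values and rebuild the identity keys via the dictionaries:

' Skeleton only - the two helper methods are placeholders, not real code.
Public Class EPrayerController
    Implements DotNetNuke.Entities.Modules.IPortable

    Public Function ExportModule(ByVal moduleId As Integer) As String _
        Implements DotNetNuke.Entities.Modules.IPortable.ExportModule
        Return SerializeAll(moduleId)   ' text values, not identity keys
    End Function

    Public Sub ImportModule(ByVal moduleId As Integer, ByVal content As String, _
                            ByVal version As String, ByVal userId As Integer) _
        Implements DotNetNuke.Entities.Modules.IPortable.ImportModule
        DeserializeAndResolveKeys(moduleId, content)   ' dictionaries rebuild the keys
    End Sub
End Class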


Bill, WESNet Designs
Team Lead - DotNetNuke Gallery Module Project (Not Actively Being Developed)
Extensions Forge Projects . . .
Current: UserExport, ContentDeJour, ePrayer, DNN NewsTicker, By Invitation
Coming Soon: FRBO-For Rent By Owner
 
2/4/2009 10:54 AM
 

Thank you very much for the excellent example. Very well done module, BTW.

 
2/5/2009 7:15 PM
 

I wanted to mention that your excellent example has sent me down a path that has considerable potential. I discussed the serialization solution your example uses with a colleague, and he pointed out that I can create a "master" class that contains all the collections and serialize that in one pass. On import, load the XML document and deserialize it back into that same class. That still means you have to plumb up the relationships, but you're doing that with a set of collections.
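
In rough terms, something like this (ModuleExport and the Info classes are illustrative names; XmlSerializer needs public types with parameterless constructors, and the key fix-up still happens after deserialization):

' Rough sketch - assumes Imports System.IO, System.Xml.Serialization and
' System.Collections.Generic; the Info classes are the module's own.
Public Class ModuleExport
    Public AddressTypes As New List(Of AddressTypeInfo)
    Public Addresses As New List(Of AddressInfo)
End Class

Public Function ExportAll(ByVal data As ModuleExport) As String
    Dim serializer As New XmlSerializer(GetType(ModuleExport))
    Using writer As New StringWriter()
        serializer.Serialize(writer, data)   ' one pass for every collection
        Return writer.ToString()
    End Using
End Function

Public Function ImportAll(ByVal content As String) As ModuleExport
    Dim serializer As New XmlSerializer(GetType(ModuleExport))
    Using reader As New StringReader(content)
        ' Deserialize back into the same class; the identity keys still get re-plumbed afterwards.
        Return DirectCast(serializer.Deserialize(reader), ModuleExport)
    End Using
End Function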

I'm testing this now and will report back on how it works. I suspect it will fail in high-volume situations, but in others it does have promise.

 