
Home » Our Community » General Discuss… » DotNetNuke Speed - Let's be honest
New Post
7/25/2006 5:18 AM
 

I have been using DNN for a number of years now and have a lot of clients running DNN. To say I love DNN is an understatement; it makes life so much easier. But the one problem I constantly face is trying to optimize speed. I know DNN has a well-planned road map as far as development is concerned, but maybe during the next development cycle special attention should be given to performance, perhaps even creating a sub-project with the sole responsibility of addressing all performance issues, both in the core and in the core modules.

 

This is just off the top of my head, but such a team could take the core and each module, find all the performance bottlenecks, and address them, and in doing so produce a performance guideline document for the rest of the teams to follow when optimising the code they write.

 


"Life's journey is not to arrive at the end safely in a well-preserved body, but rather to slide in sideways, totally worn out, shouting, 'Holy sh*t... what a ride!'"
Dragon's Den
 
New Post
7/25/2006 12:17 PM
 

It always makes me wince when I see core developers (or any developers) assume that .NET caching methodology is always the best way of optimizing an application.

Over the years, I've been involved with some of the largest intranet applications developed here in Canada. One is still arguably among the largest running (burst transactions and pages in the 1M and 25M per day range respectively, with more than 10,000 concurrent users); the application had to be built for the high burst-threshold volume simply because that's when they make their money. Caching was never the issue, nor even a consideration: it was well-thought-out application development and coding that ensured bottlenecks would not happen. And no, we didn't do what everyone touts for "great" and fast DNN sites; we didn't toss a cluster of servers at it. That's poor application design: a design that can't handle a medium-sized website, so you have to continually throw more hardware at it.

Application tuning first and foremost comes from the application or core API set. .NET caching will, in most cases, objectively give you some improvements, but it quickly moves the bottleneck down a level into .NET code. For years Microsoft has been telling us that SQL Server's caching algorithms are some of the best, and its TPS rating indicates as much. So why double-cache? Is the prevalent theory that .NET CLR code execution is faster than the tight C++ code that SQL Server uses? Caching on the code-execution side should always be done carefully and abstracted at the right layers: caching raw data is rather silly when it's already cached in SQL Server anyway, whereas caching uplevel, after complex business logic, is more beneficial. Rule #1 of performance optimization: tune the application first, and cache only if you have no other choice or don't have the money or time to do the first part.
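The layering point above (cache the result of expensive business logic, not the raw data SQL Server already caches) can be sketched as a tiny TTL cache. This is an illustrative, language-agnostic sketch in Python rather than DNN's actual API; all of the names here are made up:

```python
import time

class TtlCache:
    """Tiny TTL cache: stores the *result* of expensive business
    logic, instead of re-caching raw rows that the database engine
    is already caching for us."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expires_at, value)

    def get_or_compute(self, key, compute):
        entry = self._store.get(key)
        now = time.time()
        if entry is not None and entry[0] > now:
            return entry[1]          # still fresh: serve from cache
        value = compute()            # run the complex logic once
        self._store[key] = (now + self.ttl, value)
        return value

calls = 0
def expensive_summary():
    """Stand-in for complex business logic over many queries."""
    global calls
    calls += 1
    return {"total": 42}

cache = TtlCache(ttl_seconds=60)
a = cache.get_or_compute("summary", expensive_summary)
b = cache.get_or_compute("summary", expensive_summary)
# expensive_summary ran once; the second call was served from cache
```

The design choice being argued for is where the cache sits: above the business logic, so one cache entry replaces many computations, rather than duplicating the database's own page cache.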

Objectively, the code needs to be analyzed and proper coding methodologies put in place for code-level optimization: ascx controls replaced by pure server controls, and the entire framework vetted for wasteful lines of code execution, of which there is a lot in the core code.

Performance hurts everyone, whether you host the sites (fewer sites per system), develop (I would estimate we have lost well over 20% in development productivity moving from the DNN 2 code bases to the DNN 3 code bases), or are an end user (general frustration and loss of satisfaction), and it moves completely away from what DotNetNuke's original theme was: a phpnuke replacement.

As someone who has invested literally tens of thousands of hours in DNN module code, I find the performance trend very worrisome. Last time I checked, phpnuke could easily run hundreds if not thousands of sites on a single server. This, more than features, will spell a decline in DNN interest over time. Features are nice, but really, in most cases (with the exception of ML sites) most users we've had are quite happy with DNN 2.1.2.

If there weren't so much variation in all the DNN code bases now, we'd move our development back to DNN 2. However, with so many fundamental changes, and with things at the API level that should work but don't, we're forced to develop under DNN 3.x. Still, we implement sites based on the following theory: sites that must be fast use DNN 2.1.2; sites needing more enhanced features, e.g. ML, use DNN 3/4.

Cheers,

Richard

 
New Post
7/25/2006 3:04 PM
 

The core code is complex and has many redundant calls in it.

It is slow. That is a fact.

Put a few modules on a page, load up the code in VS, put a breakpoint on IsInRole, and run it.

I could probably find 100 exact scenarios like that: completely redundant calls. Add that up for one page and you get a lot of redundant work happening.
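The redundant-call pattern described above, where every module on a page re-runs the same security check, is exactly what per-request memoization removes. A hypothetical sketch in Python, not DNN code; `slow_lookup` stands in for whatever actually hits the database:

```python
class RequestContext:
    """Memoizes security checks for the lifetime of one page
    request, so N modules on a page trigger one lookup
    instead of N identical ones."""
    def __init__(self, role_lookup):
        self._lookup = role_lookup
        self._memo = {}  # (user, role) -> bool

    def is_in_role(self, user, role):
        key = (user, role)
        if key not in self._memo:
            self._memo[key] = self._lookup(user, role)
        return self._memo[key]

lookups = 0
def slow_lookup(user, role):
    """Stand-in for the expensive database-backed role check."""
    global lookups
    lookups += 1
    return role == "Registered Users"

# Ten modules on one page all ask the same question...
ctx = RequestContext(slow_lookup)
results = [ctx.is_in_role("bob", "Registered Users") for _ in range(10)]
# ...but the expensive lookup ran only once
```

The context object is discarded at the end of the request, so there is no staleness problem: within one page render, a user's roles cannot change anyway.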

 
New Post
7/25/2006 4:48 PM
 

I have over time slowly built and improved my single DNN website, and I have made many performance improvements in the way my modules make DNN calls. I even set up my own custom cache of some DNN data (for example, I read that the DNN 3.3 release now caches more module/tab info, which I had to cache myself because in my specific case it really slowed my site down).

So the end result is that my site is now very fast. I only cache what I considered redundant DNN data (module/tab info, portal alias) that is fetched every time I load a page. I also sacrificed a lot of the flexibility of the generic DNN interfaces in favor of code specific to my website that I know is optimized for performance. My custom modules are designed for my site, so I don't make a lot of DNN calls such as role checks. I also put all my members' non-essential profile data in another table, as I was having performance problems (probably partly due to my inexperience) with the MS membership provider.

So if you are willing to spend a lot of time on a single site, you can definitely work with DNN to get its performance where you want it to be. I am very happy with it. Out of the box it's still pretty fast too, if you set it up right, and I'm guessing with 3.3 it will be even faster.

Jason

PS: I will add a disclaimer that my site is fast for users. I did not customize the admin functionality, so the DNN admin functionality can be a bit slow for me.


Jason Koskimaki
MAKI Software
 
New Post
7/25/2006 5:24 PM
 

It depends, Jason: unless you are on your own server, there's only so much processor allotment that .NET can consume in one instance. It's actually a good idea, though, and I hadn't even thought about it, because we prefetch all that info as well, and it's nothing for us to cache it for a short duration of time.

But personally? 

I'd rather see the code optimized so it doesn't need to be cached. Caching simply adds working-set overhead, and I've seen a few sites where just loading up DNN 3.x gets near 90% of the memory threshold at which .NET performs a recycle. If .NET is forced to recycle during your busy periods because of an overused cache, that defeats the entire purpose; I've seen some sites in a continual, or nearly continual, recycle mode. For example, a module instance cache on an ML site would in theory create an in-memory copy of every module for each language. Ugh.
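The ML working-set worry above is easy to put rough numbers on. A back-of-envelope estimate in Python, with purely assumed figures (none of these are DNN measurements):

```python
# Back-of-envelope working-set math for caching a module instance
# per language on an ML site. All three numbers are illustrative
# assumptions, not measured DNN values.
modules = 200          # assumed module instances across the portal
languages = 5          # assumed number of locales on the ML site
kb_per_instance = 50   # assumed average size of one cached instance

cache_mb = modules * languages * kb_per_instance / 1024
# With these assumptions the cache alone is roughly 49 MB of
# working set, before the application itself uses any memory --
# which is how a cache can push a site toward the recycle threshold.
```

The point is not the exact figure but the multiplication: caching per module *and* per language multiplies the working set, so a cache that is harmless on a single-language site can dominate memory on an ML one.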

Granted, we all hate code maintenance, and if you look through the core team blogs you'll see comments about adding more and more features, which unfortunately just add to an already overburdened system. Unless, somewhere along the line, I've missed something and the DNN core team's long-term objective is to run solely on dedicated servers.

 

 