While it is possible to use the TableAdapter method described in the article, I would not recommend it. DataSets are a heavyweight data structure, and typed DataSets are heavier still. If you look at much of the early documentation on ASP.NET web development, you will see that Microsoft and the development community reached a consensus that the DataReader was more performant in the web environment, and it thus became the preferred data access method for web development.
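To make the difference concrete, here is a minimal ADO.NET sketch (the Users table, UserName column and connection string are made up for illustration): the DataReader streams rows one at a time over an open connection, while the DataAdapter/DataSet approach buffers the whole result set in memory before you touch a row.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static class UserLookup
{
    // DataReader: a forward-only, read-only stream. Rows are materialized
    // one at a time, so memory stays flat regardless of result-set size.
    public static List<string> GetUserNames(string connectionString)
    {
        var names = new List<string>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT UserName FROM Users", conn))
        {
            conn.Open();
            using (IDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    names.Add(reader.GetString(0));
            }
        }
        return names;
    }

    // DataSet: the adapter pulls the entire result set, plus schema and
    // change-tracking metadata, into memory up front.
    public static DataSet GetUsersAsDataSet(string connectionString)
    {
        var ds = new DataSet();
        using (var conn = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter("SELECT UserName FROM Users", conn))
        {
            adapter.Fill(ds, "Users");
        }
        return ds;
    }
}
```

For a page that just renders a list of rows and throws them away, the extra machinery the DataSet carries buys you nothing, which is why the reader won out for web work.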
Still, many people do not care for the Data Access Layer that exists in DNN because of the large number of stored procedures, DAL classes/methods and BLL classes/methods that must be maintained. This can significantly slow development. To simplify this maintenance step, many people have resorted to using code generators to spit out the DAL and BLL code. There is a learning curve to making code generation work, and I find that I often need to tweak the resulting methods, so I am still stuck creating custom sprocs and the associated DAL and BLL methods. Depending on the size and complexity of your data model, though, this is certainly a viable option.
The third data access method that I think more people should investigate is using an O/RM tool to handle the heavy lifting. In the last two or three years a lot of great options have emerged in the .NET space: NHibernate, LLBLGen Pro, EntitySpaces, XPO, Wilson ORM, iBATIS.NET, Genome and SubSonic, just to name a few of the more popular ones. Each ORM is slightly different and can provide a lot of extra value over a standard generated or hand-coded DAL. With any decent ORM the need for stored procedures almost completely goes away. There are a few cases where stored procedures are still desirable, but these are the exception rather than the rule.
Instead, most ORMs dynamically generate the appropriate queries and return a domain model that is pretty close to what you are used to working with. Most good ORMs will also optimize the queries and do some caching to further improve performance.
Right now, my preferred data access tool is SubSonic. Because it is open source, I can use it without any licensing costs. It implements the ActiveRecord pattern, which is conceptually similar to how DNN uses controllers and info classes to expose your domain model. The biggest draw for me with SubSonic is that it uses a BuildProvider to dynamically generate my data layer at run-time. This means that changes to my data model are almost instantly available to my module (ok... this is stretching it a little... but it is close enough without going into specifics). This feature lets me easily build my application and evolve the data model as I add new features or rework the model to better handle use cases I had not thought about when first sitting down to code.

I prefer a fluid, iterative development style that is closer to agile methodologies than to the classic waterfall approach, and having a tool like SubSonic really simplifies things for me. Unfortunately, BuildProviders do not work in medium trust environments. To handle this scenario, SubSonic lets you pre-generate the DAL that the BuildProvider would normally produce dynamically. This means I can do development using the dynamic approach and then deploy using the generated DAL. Personally, even if medium trust were not an issue, I would still prefer the generated code in a production environment, since I like my code to be a little more "locked down" so I know exactly what is being executed.
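If you have not seen ActiveRecord before, the idea is simply that each object wraps one row and carries its own persistence methods. The sketch below illustrates the shape of the pattern with a hypothetical Task class backed by an in-memory dictionary standing in for the database table; the names here are illustrative, not SubSonic's actual generated API.

```csharp
using System.Collections.Generic;

// ActiveRecord-style class: one instance == one row, and the instance
// knows how to save, fetch and delete itself.
public class Task
{
    // In-memory stand-in for the database table (Id -> Title).
    private static readonly Dictionary<int, string> Table = new Dictionary<int, string>();
    private static int _nextId = 1;

    public int Id { get; private set; }
    public string Title { get; set; }

    // Insert when the record is new, update otherwise.
    public void Save()
    {
        if (Id == 0) Id = _nextId++;
        Table[Id] = Title;
    }

    // Rehydrate a row into an object, or null if it does not exist.
    public static Task FetchById(int id)
    {
        return Table.ContainsKey(id)
            ? new Task { Id = id, Title = Table[id] }
            : null;
    }

    public void Delete()
    {
        Table.Remove(Id);
    }
}
```

Usage looks like `var t = new Task { Title = "Write module" }; t.Save();` followed later by `Task.FetchById(t.Id)`. With SubSonic the classes playing this role are generated from your schema, which is exactly why schema changes flow through to your module so quickly.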
Anyway, I hope that helps.