Disclaimer: I do not claim to be an SEO "expert", and I personally do not believe that such a thing exists outside of the walls of [put your favorite search engine name here].
John: How would a spider/bot know that a different version of the textual links exists if it never sees it? Besides, wouldn't it be good practice to plan for text browsers, whether or not they are bots? Serving optimized content to those browsers improves the end-user experience. Why would a site be penalized by search engines for trying to appeal to a broader audience? Some would call this practice "accessibility". I can see how such practices could easily be exploited, but the major search engines have algorithms that try to weigh legitimate implementations against known exploitation tactics.
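As a concrete example of that kind of accessibility, a script-driven menu can sit alongside a plain-text fallback so that text browsers and non-scripting spiders still see the same links. This is only a minimal sketch; the menu script and page names below are hypothetical and are not actual DNN or SolPartMenu markup:

    <!-- Script-rendered menu (e.g., what a DHTML menu would emit at runtime). -->
    <div id="menu"><script type="text/javascript" src="menu.js"></script></div>

    <!-- Plain anchor links for text browsers and spiders that do not run script. -->
    <noscript>
      <ul>
        <li><a href="/Home.aspx">Home</a></li>
        <li><a href="/Products.aspx">Products</a></li>
        <li><a href="/Contact.aspx">Contact</a></li>
      </ul>
    </noscript>

Because the fallback mirrors what the script renders for everyone else, it reads as accessibility rather than the kind of cloaking those anti-exploitation algorithms are meant to catch.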
Jon Henning (the SolPartMenu creator) briefly touches on this topic in the comments on this page:
http://www.dotnetnuke.com/Products/Development/Projects/CoreClientAPI/tabid/851/EntryID/1487/Default.aspx
Can you please offer more insight into the SolPartMenu not being SEO friendly? Judging from the discussions I have seen so far (not many), this has not been verified. Case in point: the DNN web site has used the SolPartMenu for some time now, and it has been and continues to be well indexed. (However, it would be MUCH better if SEO were a higher priority. Many pages are missing search-engine-optimized content, descriptions, and keywords.)
However, I always like to keep in mind that what is SEO friendly today may not be SEO friendly next week, and that very same thing may become SEO friendly again several months later. From the SEO feeds I subscribe to, I have quickly learned that this is a very volatile field. The best rule of thumb is to produce well-formed HTML, quality content, and as much accessibility as possible.
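To illustrate that rule of thumb, and the descriptions and keywords mentioned above, each page's head can carry a descriptive title and meta tags. A minimal sketch with made-up values, not taken from the DNN site:

    <head>
      <title>Example Page Title - Example Site</title>
      <meta name="description" content="A one- or two-sentence summary a search engine can show in its results." />
      <meta name="keywords" content="example, placeholder, keywords" />
    </head>

Whether the keywords tag still carries any weight varies from engine to engine, which is exactly the kind of volatility mentioned above.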