Web Application Problem Domain

It seems to me that some ProblemDomains are well suited to web applications and some are not. Where does one draw the line between what makes a good web application and what does not? I would submit that if you have to use JavaScript, one-pixel spacers, or other tricks to get around the limitations of HTML, your application probably should not be a web application. --BruceIde


QuickQuestions

2005-01-16

Bots and Web site programming

Q I came across this new ClientSideVsServerSideAiFunctionality page (no prior backlinks) where bots were mentioned. Elsewhere, my web-archive search failed because of robots.txt considerations. So are bots a concern of "Web site programming", of "learning HTML and CSS", or something else?

A A bit of both. On the Web site programming side, you don't need to create a RobotsDotTxt file unless there's something you don't want indexed. On the HTML side, you can use <meta name="robots"> tags to prevent indexing of particular pages, or the recently suggested rel="NoFollow" link attribute to tell search engines not to follow (or give ranking credit to) particular links. Keep in mind that all of these mechanisms are only recommendations to robots and spiders -- nothing prevents a 'bot or spider from ignoring them.
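
To make the advisory nature of these mechanisms concrete, here is a minimal sketch of how a polite crawler consults robots.txt before fetching a page, using only the Python standard library's urllib.robotparser module; the site URL and user-agent name below are hypothetical placeholders.

 # Minimal sketch: a well-behaved crawler checking robots.txt before fetching.
 # The site URL and user-agent string are hypothetical examples.
 from urllib import robotparser

 rp = robotparser.RobotFileParser()
 rp.set_url("http://www.example.com/robots.txt")
 rp.read()  # fetch and parse the site's robots.txt

 # can_fetch() only reports what robots.txt asks for; nothing stops a
 # rude crawler from skipping this check entirely.
 if rp.can_fetch("ExampleBot/1.0", "http://www.example.com/private/page.html"):
     print("robots.txt permits fetching this URL")
 else:
     print("robots.txt asks crawlers not to fetch this URL")

The same voluntary-compliance caveat applies to <meta name="robots"> and rel="NoFollow": they are effective only because the major search engines choose to honor them.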

A bit of history: the rel="NoFollow" attribute for links was proposed by Google in January of 2005 to combat comment spam in blogs. The StandardForRobotExclusion? (the robots.txt proposal) was a group effort of the members of the robots mailing list circa 1994. An Internet Draft was written, but it never became an RFC. For more information, see http://www.robotstxt.org/wc/robots.html.


In my opinion, web interfaces are best for output-intensive applications and not for input-intensive ones. Web forms are a poor and clunky substitute for "real" business-form GUI front-ends.

An exception to the output rule is paper-based output. Web results are hard to make look good on paper, especially multi-page documents. Tools are starting to appear, but it may be several years before they are mature. -- 24.30.160.130


The online book "Dive Into Accessibility" at http://diveintoaccessibility.org/ is well suited to people learning to develop WebApplications.

Another guide to Web publishing is http://philip.greenspun.com/panda/, which is part of a PhD thesis. Last updated June 2003.


MicrosoftWay references

For client-server apps that require use of the WebBrowser control, see http://msdn.microsoft.com/library/default.asp?url=/workshop/browser/webbrowser/browser_control_node_entry.asp

An InternetExplorer blog is located at http://blogs.msdn.com/ie/default.aspx. It has material on IE7 development and related insights.

Even if IE7 provides backward compatibility for applications built on previous WebBrowser controls, it may be useful to bear in mind that Microsoft-based technologies become obsolete faster than competing products. This is because MS tends to align itself with the NextBigThing sooner, and can only do so by breaking support for legacy applications, including those built with previous MS technologies.


See also: BrowserAbuseSyndrome

CategoryEnterpriseComputingConcerns

