Ntc Security Rant

It's much too easy to pick up malware on the web unless you stick to a narrow set of pages, which defeats the idea of the web. One proposed solution is to have a mark or tag for websites indicating that they are "Non-Turing-Complete" (NTC), meaning they don't run any scripts or executables and rely only on declarative standards (such as HTML or improved cousins). It's much more difficult to embed malware in declarative standards. The majority of web use doesn't really need TC; it's mostly used for eye-candy and gimmicks, and the value-to-risk ratio is too low. (Eye-candy may be fine for consumer "toys", but it's not worth it for many domains trying to get real work done.)

If a browser is switched to accept only NTC content, or to prompt when TC content is encountered, then one can avoid risky TC sites. On a site marked as NTC, the browser would refuse to run TC code and scripting, producing an error message or a notice tag in the corner.
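(A minimal sketch, not part of the original proposal: the closest thing to an NTC tag that exists today is the Content-Security-Policy response header, which lets a site tell the browser to refuse scripts and plugin content. The small Python server below is only an illustration; the host, port, and page content are made up.)

 from http.server import BaseHTTPRequestHandler, HTTPServer

 class NtcHandler(BaseHTTPRequestHandler):
     def do_GET(self):
         # A purely declarative page; the header below asks the browser
         # to refuse scripts and plugin content on it.
         body = b"<html><body><h1>Declarative-only page</h1></body></html>"
         self.send_response(200)
         self.send_header("Content-Type", "text/html")
         self.send_header("Content-Security-Policy",
                          "default-src 'self'; script-src 'none'; object-src 'none'")
         self.send_header("Content-Length", str(len(body)))
         self.end_headers()
         self.wfile.write(body)

 if __name__ == "__main__":
     HTTPServer(("localhost", 8000), NtcHandler).serve_forever()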

However, this convention would need widespread industry and browser support to be effective. Special logos may help identify NTC sites.

Those sites that rely on TC (scripting) could follow a standard for displaying the reason they use TC coding, such as, "Has a re-sizable data-entry grid that [standard X] cannot duplicate". Sites would be too ashamed to offer explanations such as, "Has a dancing kitten that responds to mouse commands". Sure, if you really want to see the dancing kitten, you click "Continue to site" or whatnot. But at least it would discourage the use of TC for minor trinkets and side shows, encouraging designers to use NTC when it's "good enough".

While it is possible now for a browser to lock out TC, many sites then won't work. But those that don't work are usually doing things that could be done just fine under NTC if they were designed "properly". The penalty for making a site rely on TC is almost nil. This proposal would change that.

-- top

Can you provide examples of "easy to pick up malware"? My experience is that it requires a certain concerted effort to pick up malware.

My evidence is merely observational anecdotes. But even if I had examples, you wouldn't want to ruin your system to check them out. It may also depend on the person. For example, I like to hunt around for obscure music, since I have "non-typical" tastes in music. But digging around in the dark corners of the web seems to increase the chance of picking up something nasty. - t

In "digging around in the dark corners of the web" (!?) have you ever picked up "something nasty"? If so, how?

I don't know how. I suspect through JavaScript, Flash, Java applets, etc. I usually turn Flash and applets off, but some pages require it. Besides, you cannot expect the general public to know what to switch off when. They just want to browse, not learn about programming languages.

If you don't know how, then why do you suspect JavaScript, Flash, Java applets, etc.? Perhaps you picked up "something nasty" through one of the more typical vectors like malware-infested desktop software or email-based phishing.

No. And I don't have desktop email on the problem machine(s). Let me ask you this: how does one know the difference between a "good" site and a bad one?

Still, why do you suspect JavaScript, Flash, Java applets, etc.? Perhaps you picked up "something nasty" through malware-infested desktop software. Perhaps you're using an inherently unsafe operating system, like a poorly-configured Windows system. You might wish to confine your questionable browsing (assuming it's that, and it's the source of your problems) to a Linux machine, throw-away virtual machine, or (say) an iPad.

I don't care whether a site is "good" or "bad". No matter what it does, it can't do anything to me and I don't need your "NTC" to help me with that.

Such conviction and certainty in your security.


The "Masses"

Re: "Perhaps you're using an inherently unsafe operating system, like a poorly-configured Windows system."

This is not just about me, or technical people in general, or techie practices. The general population has poorly configured computers, and all the nagging in the world won't change that.

I will admit I forgot to keep my Java runtime (JRE) up-to-date once, and this may have been the infection vector in one case (earlier versions had known security flaws). But we are all human and sometimes forget stuff.


This idea would never be accepted.

You would have better luck developing a system that provides security without sacrificing power. Sandboxing, hardware virtualization, the object capability model, etc. could contribute. I think ocaps are the most promising since they support secure mashups, but more investment is going into sandboxing (e.g. with Google Native Client).
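(A minimal sketch of the object-capability idea mentioned above, in Python and with hypothetical names: untrusted code is handed only the object references it needs, instead of ambient authority over the whole file system or network. Python doesn't actually enforce this discipline, since the plugin could still import os, so the sketch only illustrates the style, not a real sandbox.)

 class ReadOnlyFile:
     """A capability granting read access to one file and nothing else."""
     def __init__(self, path):
         self._path = path

     def read(self):
         with open(self._path) as f:
             return f.read()

 def untrusted_plugin(doc):
     # The plugin can read the document it was handed; it was never given
     # handles to the rest of the file system, the network, or the clock.
     return len(doc.read())

 if __name__ == "__main__":
     print(untrusted_plugin(ReadOnlyFile(__file__)))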

There are many non-security reasons to limit power, or at least push control of it closer to the user - e.g. accessibility (screen readers, language translation), transformation for different devices, persistence, upgrade and deprecation, efficient ZoomableUserInterface, more predictable behavior and failure modes, scalability and optimizability, and so on. I think these would be better reasons than security to rant against TuringComplete imperative scripting.

That would require a heavy effort by browser makers. Plus, as the web grows in prominence, the "sandbox" becomes the equivalent of today's OS and "personal folders". It's a moving goalpost. Further, I don't see significant cooperation from Microsoft because tying the browser to the desktop is their source of revenue.

Browser developers are making this 'heavy effort'. JavaScript is already a sandbox. DHTML now has standards for canvases, WebGL, and client-side storage. Google's Native Client is available today. The goalpost is moving and the big browser developers are happy to compete by following it. Microsoft is competing with its own proprietary sandboxes - SilverLight, DotNet - but doesn't really have the power to stop anything. What point are you trying to make?

JavaScript is far from hole-free, and new holes keep being found.

You have my attention. Can you provide some references to recent holes in JavaScript that actually allow malware to infect my machine?

TC is inherently more difficult to secure than NTC, sandboxes or not.

This is untrue. We can develop TC languages that are easy to secure. And we can develop NTC languages that are difficult to secure.

And yes, Microsoft does have the power to stop or slow standards. IE is still the primary corporate browser, and big companies don't like change. Look how difficult it has been to purge IE 6 from corporate usage due to legacy support issues.

One of MS's more sneaky tricks is to extend standards with fancy and/or catchy features. Developers dig them and tie the app to MS to have these extensions. If a new standard or standard feature later comes out that threatens MS, MS either doesn't include it, or includes a crippled version of it. Thus, developers don't rely on that feature and can't tell users to not use MS browsers because so many apps, including their own, use MS-only features.

I do not deny that Microsoft has the power to slow the adoption of standards. Yet, InternetExplorer still competes in a process whose ultimate conclusion is to marginalize desktop applications.

You seem to believe an Open Source Deity controls web trends. Most users don't care about a desktop holy war, or even understand it. MS often uses this "flaw" to lock users to the Windows desktop. For example, a free gizmo that ties a web app to their desktop to make it easier to send photos to grandma, or whatnot.

If there is such a deity, its name is probably 'DubyaThreeCee'.

You seem to equate 'standard' to 'open source'. That's somewhat idealistic, but not a bad ideal.

I believe web trends are controlled by the normal set of market forces. Two primary forces are involved: service providers trying to extend their reach to as many clients as possible, and clients trying to consolidate and integrate the services they use.

Together, these forces imply the emergence of standards. It's an emergent effect, like the fact that entropy increases and gas expands to fill a volume: those outcomes aren't required by physics, but emerge from probability and the path of least resistance. Developing standards is the path of least resistance for these market forces. Standards will always emerge (whether de facto or de jure) as a natural consequence of service providers trying to extend their reach and clients trying to consolidate and integrate the services.

No deity is required. GameTheory is sufficient.

Further, the open-source community may not be sufficiently motivated to tighten up TC security. Most open-source developers are tech-savvy enough to have a much reduced risk compared to the average Joe, such that they may not give TC security a sufficiently high priority. Just because better TC security is possible doesn't necessarily mean there is sufficient motivation to take the extra steps.

TuringComplete is not a valid distinction with respect to security: it does not determine authority in any meaningful way, nor does NTC even ensure termination (especially not in a reasonable time). NTC languages are plenty vulnerable to security risks, a common example being SqlInjection. The only relevant issue is authority and controlling who wields it; everything else is delusion.

SqlInjection is primarily caused by outdated APIs and interface protocols, such as ODBC and JDBC.

Do you have a point?

NTC is still easier to control. The existence of injections doesn't change that. Just about any text-based language/protocol is going to have injection risks (but injection isn't limited to just text). Further, SQL is very close to being TuringComplete. The closer a language/protocol is to being TC, the more it takes on the added risks of TC. -t

"Outdated APIs and interface protocols" as the cause of a security issue is hardly a condition unique to SqlInjection, so I'm not seeing your point. And there are plenty of TuringComplete models that are, in fact, easy to control - e.g. EeLanguage. Can you find something better than argument by repetition to defend your position that TC is the cause of the security problem, as opposed to merely correlated with it?


{Huh? The "sandbox" of today is the browser.}

And a hole-y one.

{In what way? It strikes me that your proposal is based on supposition (and superstition), not truth.}

So you are claiming that existing browsers are not the major source of today's malware?

{Despite whatever breathless journalism you might have read, browser software is not at fault behind the majority of today's installed malware. The majority of it (assuming an updated browser/OS combination, of course) comes via downloaded desktop software -- "install this to protect your machine!", various games & utilities, etc. -- or email phishing. Downloaded software is typically obtained via the Web, of course, but the social engineering involved is not the browser's fault and doesn't rely on scripting, and it's been relatively trivial to improve browser software to recognise and avoid typical threats of the scriptable sort.}

ActiveX is a risk in InternetExplorer. But we can't blame web standards for Microsoft's junk.

{I don't recall ActiveX controls running in any relatively recent version of InternetExplorer without a plethora of warnings. If you continue to load them after having been warned... Well, you're probably the sort that continues to drive after the oil pressure and temperature warning lights have turned on, and then complain to the dealer that the car should have warned you the engine was going to be destroyed. Hey, Top, did some of those "music" (yeah, right) sites ask you to install ActiveX controls? :-}

No. The few that did I skipped.


Empirical Research

Windows security breach cause survey:

http://net-security.org/malware_news.php?id=1863

Note that it doesn't mention email attachments. Whether this is because they are a small cause, or because they were not researched in this study is unknown.


Recently there's been a spate of web malware that tries to trick one into downloading or running something. It may disguise itself as an operating-system message, an anti-virus profile update, etc. Web 2.0 interfaces can be made to look and act more and more like desktop interfaces.

True. But NTC websites won't help against social hacking.

Well, that depends. If you are less likely to be at a site that can serve up GUI trickery, then you are less likely to be tricked. Only a very small portion of one's usage would need to be at TC-requiring sites. Now I will agree that if the declarative interface has features that allow it to mimic desktop applications, such trickery is more likely. So far, I haven't seen such features, but I never tested that aspect in detail.

Anyone who can be fooled by social hacking -- which is laughably obvious in every case I've ever seen -- is unlikely to notice the distinction between TC and NTC sites, or care.

There's a wide variety of social hacking (and semi-social hacking). The devil's in the details.


CategorySecurity

