
The End User License Agreement Is Why You Got Hacked

Phil Brass, Senior Evangelist and Technical Advisement professional at DirectDefense, explains why the end user license agreement might be the reason your company got hacked. This article originally appeared in Insight Jam, an enterprise IT community that enables human conversation on AI.

Most products sold in the United States come with implicit warranties, whether anything is written down or not. These include a "warranty of merchantability," which means the product is fit for ordinary use and meets the standards expected for that type of product. They also include a "warranty of fitness for a particular purpose": if you let the seller guide you in choosing a product for your purpose, then by default there is a warranty that the product you purchased is actually fit for that purpose. You could argue that these warranties are part of what makes American manufacturing so great.

You can only get out of these warranties if you sell the product "as-is," explicitly disclaiming any warranty and stating that the product is not suitable for any particular use. Or, better yet, you don't sell the product at all: you sell a license to use the product (which still comes as-is, with all faults, unsuitable for any particular use, and with no warranties). Oh, and you can't test the product in any way, or tell anyone the results if you do. Every piece of software you buy has an End User License Agreement containing disclaimers just like these: the software is not fit for any particular purpose, there is no warranty, and the manufacturer will not be liable for any damages caused by this software.

Before he passed away in 2024, Ross Anderson wrote three editions of an amazing book called Security Engineering, which I absolutely love. You can download all the chapters for free, along with lecture videos, from his website. My favorite chapter is the one on economics. Ross explains that there is a fundamental problem in buying systems with security built in by the manufacturer: if the manufacturer has to foot the bill for building security, but the consumer is the one who gets hosed if the security fails, what exactly is the manufacturer's incentive to spend a bunch of money making the product secure? Governments have long recognized this problem, and the answer to the "what is the manufacturer's incentive to make things secure" question is "warranties and liability."

In industries where warranties and liability are enforced, such as automobiles, we see a constant research effort across the manufacturing spectrum: materials science that gives us better rubber for tires, safety glass for windows, and stronger, lighter metals and composites for frames; specific safety devices like radar, lane-departure warnings, blind-spot sensors, seat belts, and air bags; and social artifacts like speed limits, stop signs, crash tests, and safety ratings.

Making companies and industries pay for their mistakes drives research that makes mistakes less likely. As we get better at making tires, all consumers get access to this technology. We all benefit from research into metallurgy, composites, rubber, and glass. Improved technology becomes expected, and eventually even mandatory, and society as a whole benefits.

When we allow an industry, such as software, to be built and grown around the persistent avoidance of accountability, liability, and responsibility via End User License Agreements, we reduce the time and money spent on this type of broad-spectrum research. Nobody needs to figure out the technologies and techniques required to make actually secure software, because there is no penalty or accountability for failing to do so. And eventually, everybody gets used to the idea that all software is vulnerable and just kind of… accepts it.

In the software world, the materials software is made of are operating systems, software development kits, programming languages, application programming interfaces, frameworks, and application code. In the sixty years since IBM sold its first mainframe, we have seen very little effort by software manufacturers toward the kind of materials-science research that would make it easier to build secure applications from these materials. Academia does a lot of research in this area, but the software industry is reluctant to adopt and apply that research because of the cost.

There are a lot of costs involved in developing and using better materials to make software, and because manufacturers bear no liability if they skip these improvements, there is no incentive to make them. Major software components like browsers and operating systems continue to be manufactured from materials like the C programming language, a language that makes it notoriously easy to create vulnerable programs.

Why is Windows still a bunch of C and C++ code? Because there has been no liability incentive strong enough to make Microsoft rewrite it. There are no giant research programs focused on making consumer software more secure. Instead, it is in manufacturers' best interest to come up with low-cost, low-effectiveness "software security" methodologies and lifecycles that create the appearance of providing secure software, and to declare that we need not worry: the manufacturers with no liability or accountability will absolutely solve the insecure-software problem, and they are the only people who can do it. Yet Microsoft still patches 100+ vulnerabilities every month.

At some point, the software industry needs to grow up. It needs to be accountable and held liable for its failures. When that happens, we will finally see drastic improvements in the materials software is made of, in how applications are developed, and in everything else we need, as an industry, to start producing software that isn't constantly vulnerable.

There are positive signs on the horizon. The European Union’s Product Liability Directive (PLD) has been updated to explicitly include software. If software causes harm, its manufacturers can be held liable. This includes security vulnerabilities, so it is conceivable that if a vulnerable SSL VPN, for example, allows an attacker to gain access to a company’s network and encrypt all their data, the SSL VPN manufacturer could be held liable for some of the damages.

Time will tell how robust enforcement of the PLD against software manufacturers will be in the EU, and more importantly for this author, whether any hint of accountability will ever be applied to software firms here in America. But one thing is certain. The software exploitability crisis will never get better until stronger warranties and liabilities are enforced on software manufacturers, and we as an industry do the basic “materials science” type research and development on the technological basis of software.
