3 hours ago by o8r3oFTZPE
"For example, if you want to see Microsoft have a heart attack, talk about the idea of defining legal liability for bad code in a commercial product."
That sort of discussion is quickly dismissed on HN, and probably elsewhere on the web.
Instead we frequently see discussion blaming users of the software, i.e., Microsoft's customers, or even suggestions to make the customer liable, or comments from "security experts" on how Microsoft has made such amazing strides in securing Windows (a tangent). In the real world, outside the Redmond/Silicon Valley monopoly space, how many mistakes does someone have to make before we start to suspect there might be problems with relying on that person's work? Even more, how many times do we hire someone knowing they have made 500+ mistakes in prior work leading up to their application?
If Microsoft products are so infallible when used as instructed by Microsoft, then why would Microsoft have a heart attack, as Snowden suggests? What a remarkable state of affairs we have today, where employers such as Microsoft can call their employees "engineers", and yet both the employer and employees are absolved of any liability for the so-called engineer's work. The number of "second chances" Microsoft gets is nothing short of astounding. A bit like the number of pardons we grant Google or Facebook for privacy infractions. Infinite.
https://www.nspe.org/resources/professional-liability/liabil...
2 hours ago by dmoy
Most of the people I went to Uni with ended up in fields where the companies are liable for bad stuff, to a certain degree. It does exist. However:
* you get paid a lot less
* the companies and industries move very slowly
* you spend a lot more time writing long-form documents, some time just reusing existing stuff wholesale, and almost no time building genuinely new things
I mean like Real Engineering fields. What we do in software is not real engineering, not even close. That has pros and cons.
I saw someone else talking about Rust, but I don't think that's what would happen in such a world. Rust is too new; if the company were actually liable for problems, its legal arm wouldn't let you use it. I think what would happen is that everything would slow way down, half or more of the people working on code right now would lose their jobs, hobby programming would either disappear or become very insular and poorly distributed (because if companies are liable, then individual people will also get sued for bad software), and you'd spend most of your time working with small pieces of 30-40-year-old technology.
I think that eventually software may get to such a point. Just be careful what you wish for.
2 hours ago by nextos
I completed an MSc in Formal Methods a decade ago, and I've worked on software projects where the level of rigor was equivalent or superior to that of any classical engineering field: for example, railway signaling and some real-time control systems. We delivered complex artifacts that have had zero defects throughout their lifetime (> 15 years).
I believe lightweight formal methods are quite promising and might let software move relatively quickly and economically while retaining some rigor. Look into Liquid Haskell for some ideas that might become mainstream.
a few seconds ago by kreelman
Liquid Haskell looks similar to Eiffel in terms of contracts.
I imagine a whole bunch of power could be derived from this in Haskell. I don't know how heavily contracts were/are used in Eiffel (I don't think it's so popular these days).
It would be great to know whether contracts had a measurably useful outcome on projects, and what that measure was (addressed a market that we otherwise wouldn't have been able to, dropped runtime errors to only system-based errors, etc.).
WRT your projects, I imagine it would be great to see your product out there working with a known level of uptime and quality. I could imagine it's a bit of a shift from the normal "throw it together" of ... a lot of software.
I'd be curious to know what it's like to work on day to day, year to year. I imagine lots of software will still be "throw it together" for a long time yet, but even if the subsystems are formally verified, it could have a useful impact on the software that consumes these (verified) modules.
29 minutes ago by throwmeawayya
How does Liquid Haskell compare to TypeScript?
an hour ago by dmoy
I personally think it would be excellent if that was the only way you were allowed to code. Though I'd probably lose my job haha. I'd sleep better at night.
an hour ago by Veserv
The GP is arguing that companies should be held liable for the harm that they can and do cause. You are countering that argument by claiming that doing so would require all companies to adopt onerous measures. However, that counterargument is only valid if we assume that all companies can cause the same amount of harm and thus have equal liability, and that adopting those measures is unavoidable.
That assumption is deeply flawed. We do not hold toy car manufacturers to the same standards as actual car manufacturers. We do not hold every manufacturer of screws to the same standards as the manufacturers of screws on airplanes. Or rather, we do hold them to the same standards; we just know that certain use cases basically cannot cause much harm in the event of failure, so in practice the standards needed to mitigate the worst case are much lower.
Software liability does not mean that everybody suddenly needs to take the same care as safety-critical industries. It only demands that level of care if you are making safety-critical software and you are incapable of separating the safety-critical components from the non-critical ones. What it really means is the repudiation of the one-size-fits-all, lowest-common-denominator expectation of quality.
an hour ago by Spooky23
Liability just means more controls to avoid blame, and tighter specifications. Malpractice laws don't make doctors less dangerous; they mostly encourage ass-covering exercises.
I worked at a place that had a formally verified application running on some mainframe. It was wonderful, except that the process was excruciating and maintaining that validation prevented any changes. Every code change cost a minimum of $25,000 in 2002 dollars.
It was dumb. They would have been better off with a paper process and an army of clerks.
an hour ago by nyolfen
How does Linux fare in this scenario? Few things are as critical in terms of infrastructure.
2 hours ago by zarzavat
Actually it's simpler. People and organizations would just move to a jurisdiction where such liability laws didn't exist. Apple would move. Microsoft would move. Google would move.
And then the US would be forced to decide whether to accept imports of foreign devices and software (created under the no-liability framework) or to stay with homegrown technology frozen in time.
The best thing you can say about this proposed reform is that it would make a great plot for a sci-fi novel.
2 hours ago by imglorp
> fields where the companies are liable for bad stuff...
> * the companies and industries move very slowly
The reason the rest of us move so fast is that we skip safety, security, quality, and maintainability to get to market. Those things are usually perceived as "not bringing immediate customer value".
34 minutes ago by throwmeawayya
> The reason the rest of us move so fast is we skip safety, security, quality, and maintainability to get to market.
And deliver cheaper results.
2 hours ago by chevill
>How many mistakes does someone have to make before we start to suspect there might be problems with relying on that person's work?
There are a lot of legitimate criticisms of modern OS security, whether we're talking about Linux/Android, macOS/iOS, or Windows. However, we can't ignore the scope of these programs. Supposedly Windows 10 is approximately 50 million lines of code, and due to its overwhelming popularity it has almost certainly been targeted more often than all of the other OSes listed combined. I am pretty sure all of these operating systems are > 10 million lines of code.
Whose work are you going to rely on instead? The level of security in these OSes isn't equal across the board, but I assure you zero-days exist for all of them, and barring some kind of miraculous technological breakthrough, they'll continue to pop up from time to time as long as these systems exist.
Suppose someone pulled off a miracle by making a security-focused OS that's easy for non-technical people to install and use, and that actually gains enough traction to establish a market share. Such a thing would likely get lots of things right where others have failed, but it would also likely get lots of things wrong. That doesn't mean we shouldn't try, and it doesn't mean we shouldn't encourage both old and new companies to improve the situation; it just means that it's an incredibly difficult and likely never-ending task. Security is a process, not an achievement.
an hour ago by failuser
Does it need to be 50 million lines of code? When you design with security in mind you might have to prune old code, drop some risky optimizations, and probably drop some features. Using a higher-level language might help reduce the line count as well, at the cost of performance and memory consumption.
an hour ago by chevill
>Does it need to be 50 million lines of code?
Probably not, and they should try to reduce the size where it makes sense. However, if all of the OSes I listed are above 10 million lines, then even in the best-case scenario a modern operating system isn't going to be anything less than an overwhelmingly large and complex program.
2 hours ago by stickfigure
We tried this with general aviation. Private plane manufacturers all went bankrupt, and now the minimum price for a new airplane is in the hundreds of thousands of dollars.
Apply strict liability to software, and you'll see the same results. Every piece of software will have to be constructed with the care of a medical device. Expect most forms of technological progress to come to a halt. Some part of the HN crowd will post "I want that" from their iPhone (which wouldn't exist under such a regulatory scheme).
an hour ago by SamuelAdams
I'm reading a book called "An American Sickness" and it discusses medical devices. Turns out a lot of them are pretty poor and often have less testing and verification than most people think.
There's one story about a hip implant that went bad. The doctor recommending and performing the surgery was also the patent holder and had a vested interest in getting this particular implant into as many patients as possible. The recipient was actually patient #8 to receive this particular implant. Also, the implant wasn't fully approved yet, and the FDA simply trusted the doctor to monitor the device for problems.
This isn't isolated, either. The chapter has several examples of medical devices going into patients and those patients experiencing negative health outcomes. Turns out laws are only as good as the agencies that enforce them.
2 hours ago by vineyardmike
> Even more, how many times do we hire someone knowing they have made 500+ mistakes in prior work leading up to their application?
While I don't entirely disagree with the point you're making, I have made many mistakes in my career and I still get hired, and I suspect you have too. Most people make mistakes, and still have jobs.
Realistically, people/orgs make mistakes. Requirements change. Expectations change, security practices change, etc.
Roman bridge engineers would likely struggle to build bridges at the scale and to the requirements they're built to in 2021 (miles long, tall, huge train weights, etc.). Things change in technology jobs a LOT faster than in typical NSPE engineering licensees' jobs. It's a different world. There are probably bridges under construction today that were designed before the iPhone.
3 hours ago by _wldu
Nice to see that he linked to Rust as a safe alternative to the C's (C, C++, Objective-C). There is a place for Go and Java as well, above the system level. Both perform well and are safe.
2 hours ago by zozbot234
Go is not safe for concurrent code, unfortunately. Specifically, it does not protect against data races on non-atomic types, which can lead to torn writes and break invariants that are required for safety.
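To make that concrete, here is a minimal, intentionally racy sketch (purely illustrative, hypothetical types): a Go interface value is a two-word (type, data) pair, and the unsynchronized writes below are not atomic, so the reader can observe a torn pair, i.e. one concrete type's method table combined with the other's data.

    package main

    import "fmt"

    // Intentionally racy: `go run -race` flags this immediately. A Go
    // interface value is a (type, data) word pair, and the writes below
    // are not atomic, so a read can pair b's method table with a's
    // (smaller) allocation: the invariant break that becomes memory
    // unsafety.

    type reader interface{ read() int }

    type a struct{ x int }
    type b struct{ y, z int }

    func (v a) read() int { return v.x }
    func (v b) read() int { return v.y + v.z }

    var shared reader = a{}

    func main() {
        go func() {
            for {
                shared = a{x: 1} // racy two-word write
            }
        }()
        go func() {
            for {
                shared = b{y: 2, z: 3} // racy write of a different type
            }
        }()
        sum := 0
        for i := 0; i < 100000000; i++ {
            sum += shared.read() // racy read: may see a torn pair
        }
        fmt.Println(sum)
    }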
9 minutes ago by azakai
You're right that Go has those flaws, and it would be nicer if they didn't exist. However, in practice Go is still far, far safer than C and C++. Use after free etc. are far easier to exploit than Go's races.
If you can write in 100% safe Rust, Swift, C#, Java, etc. - and not use any of the unsafe escape hatches they provide at all - that would be best. But using some small amount of unsafety in any of those languages, or using Go, would still be a huge improvement over the typical C or C++ codebase.
2 hours ago by lu4p
True, but it's still memory-safe. Also, Go has a race detector included, which is really easy to use: https://blog.golang.org/race-detector
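For instance, a tiny hypothetical test with an unsynchronized counter; running it with `go test -race` reports the race at runtime (the detector only catches races the test actually executes):

    package counter

    import (
        "sync"
        "testing"
    )

    // Run with: go test -race
    func TestCounterRace(t *testing.T) {
        n := 0
        var wg sync.WaitGroup
        for i := 0; i < 2; i++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                n++ // unsynchronized read-modify-write: flagged by -race
            }()
        }
        wg.Wait()
        if n != 2 {
            t.Logf("lost update: n = %d", n)
        }
    }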
2 hours ago by novok
Can you do a security exploit with a concurrency bug in Go, though? You may corrupt data, but can you cause remote code execution with it?
2 hours ago by kortilla
RCE in that process, maybe not, but a security exploit, absolutely. Things like that can be the difference between admin and non-admin privileges, or an "authenticated/not-authenticated" scenario.
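A contrived sketch of that (hypothetical names, no memory corruption involved): session state written concurrently without synchronization, read by a check-then-act handler.

    package main

    import "fmt"

    // Contrived sketch: the writer mutates session fields with no
    // synchronization, so the handler's check-then-act can observe
    // "authenticated" paired with a stale user, or grant access in
    // the window between updates. Corrupt data alone breaks the
    // auth invariant.

    type session struct {
        authenticated bool
        user          string
    }

    // handle reports whether access would be granted at this instant.
    func handle(s *session) bool {
        return s.authenticated && s.user != "" // two racy reads
    }

    func main() {
        s := &session{}
        go func() {
            for {
                s.user = "alice"
                s.authenticated = true // racy writes
                s.authenticated = false
                s.user = ""
            }
        }()
        grants := 0
        for i := 0; i < 1000000; i++ {
            if handle(s) {
                grants++
            }
        }
        fmt.Println("unsynchronized grants observed:", grants)
    }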
an hour ago by yellow_lead
Yes, not all RCEs are caused by memory bugs.
2 hours ago by mattgreenrocks
Corrupt data may break invariants, which widens the attack surface.
31 minutes ago by siliconc0w
I don't think you'd have to ban unsafe languages, but you could pass a law that slowly decreased the amount of money the government could spend on 'unsafe' software (whether licensed directly or rented through clouds). The government is such a huge client of these companies that it'd immediately create a large financial incentive to migrate.
2 hours ago by 3pt14159
I'm not going to comment on Snowden's view of what liberal Western states do when it comes to surveillance. I have my own opinion, but he's been right about things I'd disagreed with him on in the past, so I'm gun-shy about confronting his ideas again.
On the topic of unsafe languages, though, he's absolutely right. We don't have to put up with this. We could pass a law banning new code in unsafe languages from national-security-threatening devices like phones, or cyberphysical devices like self-driving cars, and we would be the better for it in under a decade. We could even use taxes to give breathing room during a transitionary period, encouraging the shift before banning unsafe code outright, but we don't. We don't, because so often it is less of a hassle for the government to trust the private sector than it is to take a real position on the regulatory issues that matter. This will probably always be the case until the government is able to evaluate talent, and pay salaries in accordance with that talent, the way the private sector does.
an hour ago by lifeisstillgood
I tend to bang on about software not as engineering but as literacy. It makes some sense even here - bad code is as common as bad law - and often for the same reasons: politics, money, and hard questions.
"Engineering" is a wide subject - the big stuff is carefully built and highly regulated - bridges and buildings. But as we go down the scale we see engineering give way to the problems of politics and money - tower blocks collapse for example, and then we see human level engineering - factory tools that try to meet inflicting goals, dangerous toys and so much more.
The software world should not beat itself up for not being like all those engineers - when lives are not on the line, engineers get tied up just the same as the rest of us. And when lives are on the line, software and hardware engineering have learnt a few things: reduce the scope to the barest possible essentials, have a lot of redundancy, and stick to well-known designs.
4 hours ago by ngneer
Loved seeing language safety features called out. How can we prevent monoculture yet retain interoperability and profitability? Are these inherently at odds?
3 hours ago by mikewarot
Diversity can help improve interoperability. If everyone sticks to only the public APIs and protocols, you can cross-run unit tests, make sure that every implementation passes all of the unit tests, and check that they yield the same results; the only difference should be in performance (see the sketch below).
Like farming, monocultures seem profitable, until the pests show up, and can swamp out all of the advantages.
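In Go terms (hypothetical names, just a sketch of the idea above), that cross-implementation testing could be a single conformance suite that every implementation is run through:

    package conformance

    import "testing"

    // KV is the shared public API that all implementations must satisfy.
    type KV interface {
        Put(key, value string)
        Get(key string) (string, bool)
    }

    // RunKVTests exercises any implementation identically; each vendor's
    // test file just calls RunKVTests(t, func() KV { return NewTheirKV() }).
    func RunKVTests(t *testing.T, newKV func() KV) {
        kv := newKV()
        kv.Put("k", "v")
        if got, ok := kv.Get("k"); !ok || got != "v" {
            t.Errorf(`Get("k") = %q, %v; want "v", true`, got, ok)
        }
        if _, ok := kv.Get("missing"); ok {
            t.Error(`Get("missing") should report absence`)
        }
    }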
4 hours ago by bambax
> it is still hard for many people to accept that something that feels good may not in fact be good
This strikes me as surprising. I have always been taught the opposite: if it feels good, it's probably bad for you, or illegal, or immoral, or all three.
2 hours ago by tomc1985
Protestant Western propaganda at its finest
2 hours ago by koolba
> if it feels good, it's probably bad for you, or illegal, or immoral, or all three.
Pretty much. The only exceptions I've found are exercising and saunas.
3 hours ago by ASalazarMX
I'd take out the immoral part, as it feels puritan. Also, there are nice things that are good, legal, and moral, like stretching after a good night's sleep, a professional massage, ad blocking, etc.
Maybe I'm being too literal here.
3 hours ago by mikewarot
I was all ready to post a reply... when I was hit by the darkest pattern of all... you must pay to do so, with no hint prior to that moment.