Everything's broken and nobody's upset
Software doesn't work. I'm shocked at how often we put up with it. Here are just a few issues - literally off the top of my head - that I personally dealt with last week.
- My iPhone 4s has 3 gigs of "OTHER" taking up space, according to iTunes. No one has any idea what other is and all the suggestions are to reset it completely or "delete and re-add your mail accounts." Seems like a problem to me when I have only 16 total gigs on the device!
- The Windows Indexing Service on my desktop has been running for 3 straight days. The answer? Delete and rebuild the index. That only took a day.
- I have 4 and sometimes 5 Contacts for every one Actual Human on my iPhone. I've linked them all, but duplicates still show up.
- My iMessage has one guy who chats me and the message will show up in any one of three guys with the same name. Whenever he chats me I have to back out and see which "him" it is coming from.
- I don't think Microsoft Outlook has ever "shut down cleanly."
- The iCloud Photo stream is supposed to show the last 1000 pictures across all my iOS devices. Mine shows 734. Dunno why. The answer? Uninstall, reinstall, stop, start, restart.
- Where's that email I sent you? Likely stuck in my Outlook Outbox.
- Gmail is almost as slow as Outlook now. Word is I should check for rogue apps with access to my Gmail via OAuth. There are none.
- UPDATE: Yes, I know how OAuth works, I've implemented versions of the spec. A Gmail engineer suggested that perhaps other authenticated clients (GMVault, Boomerang, or IMAP clients, etc) were getting in line and forcing synchronous access to my Gmail account. Gabriel Weinberg has blogged about Gmail slowness as well.
- I use Microsoft Lync (corporate chat) on my Desktops, two laptops, iPhone and iPad as well as in a VM or two. A few days back two of the Lync instances got into a virtual fight and started a loop where they'd log each other in and out declaring "you are logged into Lync from too many places." So basically, "Doctor, it hurts when I do this." "Don't do that."
- Final Cut Pro crashes when you scroll too fast while saving.
- My Calendar in Windows 8 is nothing but birthdays. Hundreds of useless duplicate birthdays of people I don't know.
- iPhoto is utterly unusable with more than a few thousand photos.
- Don't even get me started about iTunes.
- And Skype. Everything about the Skype UI. Especially resizing columns in Skype on a Mac.
- Google Chrome after version 19 or so changed the way it registers itself on Windows as the default browser, breaking a half dozen apps (like Visual Studio) that look for the specific registry keys that every other browser writes.
- I should get an Xbox achievement for every time I press "Clear" in the iPhone notification window.
- I've got two Microsoft Word documents that I wrote in Word that I can no longer open in Word as Word says "Those aren't Word documents."
- Three of my favorite websites lock up IE9 regularly. Two lock up Chrome. I never remember which is which.
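A footnote on the duplicate-contacts item above: linking contacts is essentially a record-linkage problem. Here is a minimal sketch of the naive approach, grouping records by a normalized phone number. The contact dicts are made up for illustration; a real sync engine faces far messier data, which is part of why this keeps going wrong.

```python
import re
from collections import defaultdict

def normalize_phone(raw):
    """Strip punctuation and a leading country code so that
    '+1 (503) 555-0100' and '503.555.0100' compare equal."""
    digits = re.sub(r"\D", "", raw)
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]
    return digits

def link_contacts(contacts):
    """Group contact records that share a normalized phone number."""
    groups = defaultdict(list)
    for contact in contacts:
        groups[normalize_phone(contact["phone"])].append(contact["name"])
    return dict(groups)

# Hypothetical records, mimicking one Actual Human stored several ways.
contacts = [
    {"name": "Him (work)",   "phone": "+1 (503) 555-0100"},
    {"name": "Him",          "phone": "503.555.0100"},
    {"name": "Someone Else", "phone": "503-555-0199"},
]
print(link_contacts(contacts))
```

The hard cases are exactly the ones this toy ignores: records with no phone number, different numbers for the same person, and near-matching names.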
All of this happened within a single week of actual work. There are likely a hundred more issues like this. Truly, it's death by a thousand paper cuts.
I work for Microsoft, have my personal life in Google, use Apple devices to access it and it all sucks.
Alone or in a crowd, no one cares.
Here's the worst part: I didn't spend any time on the phone with anyone about these issues. I didn't file bugs, send support tickets, or email teams. Instead, I just Googled around and saw one of two possible scenarios for each issue.
- No one has ever seen this issue. You're alone and no one cares.
- Everyone has seen this issue. No one from the company believes everyone. You're with a crowd and no one cares.
Sadly, both of these scenarios ended in one feeling. Software doesn't work and no one cares.
How do we fix it?
Here we are in 2012 in a world of open standards on an open network, with angle brackets and curly braces flying at gigabit speeds and it's all a mess. Everyone sucks, equally and completely.
- Is this a speed problem? Are we feeling we have to develop too fast and loose?
- Is it a quality issue? Have we forgotten the art and science of Software QA?
- Is it a people problem? Are folks just not passionate about their software enough to fix it?
- UPDATE: Is it a communication problem? Is it easy for users to report errors and annoyances?
I think it's all of the above. We need to care and we need the collective will to fix it. What do you think?
P.S. If you think I'm just whining, let me say this: I am complaining not because it sucks, but because I KNOW we can do better.
Related Posts in this Three Part series on Software Quality
- Everything's broken and nobody's upset
- A Bug Report is a Gift
- Help your users record and report bugs with the Problem Steps Recorder
Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.
The report that pops up there will show the size of an app with its added "documents & data"; this might be the source of your huge "Otherness".
As an example: I have a Reddit reader called Alien Blue that weighs in at 24.8 MB, yet it is using an extra 97.6 MB for what I can only assume are caches and downloaded images. I've seen the Twitter app on an iPhone take up more than 400 MB, which is just plain nuts.
Some apps, like Flipboard, have a way of clearing the cache. Others you have to delete and reinstall to get your space back.
Hope this helps alleviate *one* of your problems. Me? I'm stuck on this one: https://discussions.apple.com/thread/3398548?start=60&tstart=0 and I see no light at the end of the tunnel.
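The arithmetic behind that diagnosis is simple. A tiny sketch, treating the example numbers above as hypothetical inputs, of how per-app "documents & data" adds up to a chunk of "Other":

```python
# Hypothetical per-app sizes in MB, mirroring the examples above.
apps = {
    "Alien Blue": {"app": 24.8, "documents_and_data": 97.6},
    "Twitter":    {"app": 35.0, "documents_and_data": 400.0},
}

def cache_overhead(apps):
    """Space each app consumes beyond its own binary: the caches and
    downloaded files that likely show up as iTunes' mysterious 'Other'."""
    return {name: sizes["documents_and_data"] for name, sizes in apps.items()}

total_other = sum(cache_overhead(apps).values())
print(f"{total_other:.1f} MB of 'Other' from just two apps")
```

Two apps alone can account for nearly half a gigabyte, which on a 16 GB device is very noticeable.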
Now it's all about triage and mitigation while still finding a way to plan new features for the next release. I got tired of all my Facebook friends showing up in my Windows Phone calendar (with alerts, mind you) and found out how to turn it off (somewhere in your Windows Live account settings), but still hadn't bothered to do so when I upgraded to the Lumia 900 (30 days to the day before MSFT announced it won't support Windows Phone 8).
It's not that no one cares... it's a question of who this edge case affects. 0.5% of our users? Maybe that's worth looking at. Workaround in place? Fuhgetaboutit.
It makes the open source model a little more attractive, no? Got an itch? Here's a back scratcher; put it to use.
Many large enterprises have a great deal of division of labor in the software development process. The people who decide what R&D should deliver base it on revenue prospects, and fixing old features doesn't drive revenue. So if you have the lead in the market, there is little motivation to improve usability, general user experience, or even underlying code quality if it is "good enough" and "selling". Sad but true.
Eventually, the large enterprise can use up their goodwill and a new company can displace them by caring more, but it takes years and it is risky so a small business is less likely to take on the lion, rather go to blue ocean opportunities.
In some industries, software vendors that have been around for decades (think COBOL) are still selling their crappy old "good enough" solutions, because new software is so complex for some customers that they would rather stick with what they know. You or I rarely see this in the circles we run in, because those people adopt the latest and greatest, but so many cannot, still.
The apathy toward user experience is a very sad thing to me, because a happy customer will stay with you through thick and thin if they feel like you care enough. How much does the crappy iTunes interface show that Apple cares about the people making purchases? It is horrendous.
This forces producers to release both software and hardware that is tested much less than it should be.
Companies don't have the incentive to make "perfect" software because customers aren't prepared to pay what it would cost. We have a balance at the moment where most software mostly works and is pretty cheap (or free). People get more value out of new software that does new things than they do from perfecting existing software.
You're right, Scott. Software sucks. I've always felt that one of the reasons that things have gotten this bad is that there really isn't a penalty for shipping shit. Software, being licensed and not sold, has managed to get exempted from product liability lawsuits. Imagine the focus on quality if you could get sued for shipping buggy software.
You see this especially for software companies and how hard it is to even submit a bug report. Ever try to submit a bug report for any Office app? It's amazingly hard to find any place to put this info. All you're left with is forums. Google - you can't ever get a human for anything. Try finding any sort of support link other than forums on Google's sites anywhere. Support links point to FAQs, point to forums, point to FAQ and so on. The only way I can interpret that is that clearly they don't give a shit if things don't work or what their customers think. I've had a number of unresolved issues with Adsense with Google and NEVER EVER have gotten anybody to answer my questions on forums or otherwise...
And worst of all it's become the norm, so that nobody can really complain because the alternative is - well often there's no better alternative. You can get the same shitty service from some other company. So, basically it's put up or shut up.
The other issue is complexity of our society in general. We continue to build more and more complexity that depends on other already complex systems. We are achieving amazing things with technology (and not just computer related but also infrastructure, machinery etc.), but I also think that all this interaction of component pieces that nobody controls completely are causing side effects that can in many cases not be tested completely or successfully, resulting in the oddball interaction failures that you see (among other things). The stuff we use today is amazingly complex and I sometimes honestly wonder how any of it works at all. What we can do is pretty amazing... but all that complexity comes with a price of some 'instability'.
I often wonder whether we really are better off now than we were 30+ years ago, before computers took over our lives. We have so much more access to everything, but it's also making our lives so much more complex and stressful, taken up with problems we never had back then. Every time we fix a computer problem we're effectively wasting time: we're not doing anything productive, we're throwing away valuable time that could be spent doing something useful or just enjoying life.
And then I go on back to my computer and hack away all night figuring out complex stuff. Yeah I'm a hypocrite too :-)
1. We do not (want to) pay - what quality do you expect for $1 or, worse, for pirated copies?
2. A lack of pride; developers do not aim for high quality, instead they aim for quantity.
Anyone can write a program, hardly anyone is able to do it right.
If things are this broken for people who live and breathe all this stuff, imagine how broken it is for the average Joe who doesn't have the skill or confidence to understand why software sucks.
My wife fully assumes that when something goes wrong or stops working on her device, it's something she did. The only specific app she'll blame for behaving "stupidly" is Facebook. Everything else is "Can you take a look at this? I'm after screwing it up". 99 times out of 100 it's nothing she did.
Of course small developers will have bugs as well but it is much easier for them to deliver a fix. You could email the developer of AdBlock about your problem and almost certainly get a response, good luck even finding an email address to contact Google about a bug.
Switch to using a Windows Phone. You work for MS, so that shouldn't be too hard.
Switch to a decent email client (I use gmail wherever possible).
Switch to an ad blocker which has a bias to false negatives rather than false positives.
Chrome works fine.
Switch to using professional apps for editing images and video. They ARE built to handle huge workloads.
As for all the Apple software, they never said it would work well, just that it looks pretty. (Try Winamp.)
The old joke says: "How does a software developer fix his car? He shuts it down, exits, then enters again and turns it on."
What can we do as users? File bugs, and if it gets too problematic, switch platforms. That is the only thing we can do.
I take the time to report problems I find... and shame the companies on Twitter when they don't respond. Just did that in my last two tweets a few days ago. @abdu3000
What I really wanted to say was that things are too complicated. There are too many layers. I never stop being fascinated by how simple engines and cars are. Every farmer I know can take one apart, put it together again, and make it work. There are no layers of abstraction of any kind. You step on a pedal, and you see things move.
Software? Not so much... There are lots of layers, and every single one will make your application behave in ways that you have no idea how to solve. Like dependency injection. A good idea by itself, but now, every time there's something wrong, I have a 40-level-deep call stack with no code in it that I know of. Is it worth it? I'm not so sure...
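That deep call stack is easy to reproduce in miniature. A toy sketch (the Container here is hypothetical, not any real DI framework) showing that even one layer of factory indirection constructs objects noticeably deeper in the stack than direct construction does:

```python
import inspect

class Logger:
    def __init__(self):
        # Record how many frames deep we were when this object was built.
        self.depth_at_construction = len(inspect.stack())

class Service:
    def __init__(self, logger):
        self.logger = logger

# Direct construction: the constructor is called straight from here.
direct = Service(Logger())

# A toy container: every resolve() inserts factory and lookup frames
# between you and the constructor you actually care about.
class Container:
    def __init__(self):
        self._factories = {}

    def register(self, key, factory):
        self._factories[key] = factory

    def resolve(self, key):
        return self._factories[key](self)

container = Container()
container.register("logger", lambda c: Logger())
container.register("service", lambda c: Service(c.resolve("logger")))
injected = container.resolve("service")

# The injected logger was built deeper in the stack than the direct one;
# real containers add far more frames than this toy does.
print(direct.logger.depth_at_construction,
      injected.logger.depth_at_construction)
```

Multiply those few extra frames by interceptors, proxies and lifetime scopes in a production container, and the 40-level stack traces the comment describes follow naturally.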
If things are this broken for people who live and breathe all this stuff, imagine how broken it is for the average Joe who doesn't have the skill or confidence to understand why software sucks.
I'm about as close to an average Jane/software developer hybrid as you can get; I didn't touch a computer except for writing and e-mailing up until a year ago, when it was suggested that I try programming. I fell in love, and am now getting my master's degree in Software Development at QUT in Australia. I never fully understood the concept of "ignorance is bliss" until I started studying user interaction design. Things like the Skype and iTunes UIs frustrated me before, but it never occurred to me that they could be changed. Now, spending so much of my time working with human cognition, learning, memory and how they impact the design of software makes it difficult for me to turn on my laptop without wincing. Particularly since I live 10,000 miles from home, I rely heavily on Skype and GMail to stay in contact with my family. When these things don't work, it has a direct impact on so many people's lives. If my 92-year-old grandmother doesn't know how to find a contact on Skype, she can't call me. While I am very new at designing software, it seems like a worthy goal to remember that human beings are the ones who have to use it, and no matter how interesting or funky my code may be, if it doesn't translate into something usable, it's utterly worthless.
I like the way you criticize the current state of development. And you are utterly right.
However, how long has mankind been programming? Right: only about 30 significant years, covering just about two generations.
How long did it take to get fire working, or to implement decent steam engines?
Yes, it took a lot longer. And I'm sure in those days, with every failing engine, they said "we can do better". But they didn't; it took more than 30 years to get those things working reliably.
So I think it is a matter of time and education to get all IT professionals to realize that their "quick workarounds" won't work in production, and it will take sales A LOT LONGER to understand that we need more time to guarantee quality.
Your humble follower.
Other things in our world that are hard to get right - computer hardware, bridges, aeroplanes, etc. - are hard to make, so the cost of getting them right is not so large compared to the cost of making them in the first place. But when hacking together a piece of software that works for me (and only me and my situation) takes an hour, while ensuring I get it right for everyone in every situation takes weeks, that cost is psychologically very hard to swallow. This impacts people at every level of an organisation, from developers to managers to sales people to the CEO. Everyone has to swallow it to get it right, and if one person doesn't, there will be pressure to take shortcuts, and so it will be wrong.
I would be very very happy, if someone would recommend a tool to resolve this. :)
People become developers to earn money, not for the satisfaction of having created something useful (something of value). Empathy begins to fade.
Companies are focused more on the numbers and on board satisfaction than on the customers.
After some time this will rise to a peak, and a lot of companies will rethink how they think and act.
I hope so. If not, we are in the middle of the s*@t!
I had to sign up for many services and log in to many things (iTunes, iCloud, App Store...) just to get a few things done.
The interface might look fancy, but it's far from being stable. I still couldn't connect to Twitter for some unknown reasons.
I think the solution is to cultivate a culture of building things the right way. No, not perfect; just build things that work.
It's quite hard and challenging.
Sorry. I was just trying to make a point here. Almost every bright guy who could think about a piece of software as a whole, provide an end-to-end implementation, consider quite a few edge cases, etc., is now busy dreaming up his/her own startup (and making a bit of money while at it).
Large corporations are busy solving scaling issues so that they can add the next billion in the next few hours.
Nobody is interested in plain old and "mundane" work of optimization, performance, quality, etc.
One other thing. You mention Windows 8 and then IE9, shouldn't you be on IE10 (which is awesome)?
The bugs you list remind me of Spolsky's concept of Software Inventory ( http://www.joelonsoftware.com/items/2012/07/09.html ): you *can* build an infinitely ideal product, but you need *infinite* time. Whereas the market demands new versions, new resolutions, etc.
There's another thing about bugs which is also key in this issue: fixing them won't sell more licenses of vNext: people won't run to the store, yelling "OMG! They fixed bug 33422!!!1 I can't believe it!". They'll run to the store because new, shiny things have been added, like a completely new UI with ergonomic characteristics of which no-one really knows whether it's actually a step forward.
We all know software contains bugs, even though we did our best to fix them before RTM. But in the end, we have to admit that that label, 'RTM', is really just an arbitrary calendar event, not a landmark which says "0 bugs!". This means that even though something went RTM, it doesn't mean it's bug free, it simply means: "It's good enough that it won't kill kittens nor old ladies".
A wise man, who sadly passed away way too soon, once said to me: "Your motivation and ability to fix issues and bugs is part of the quality you want to provide." In other words: even though at first glance your software might look stunning out of the box, if that essential part of its quality is missing - that when a bug pops up, it gets fixed, pronto - your software isn't of the quality you think it is.
If I look at today's software development landscape, I see a tremendous number of people trying to write applications without the necessary skills and knowledge, in languages and on platforms which have had a bad track record for years, yet no one seems to care, or at least too little. Last year I was in an open-spaces session and some guy explained that they would place people who had just started programming at customers as 'trainees' for 'free', and after a few months these trainees became 'junior' programmers and the client had to pay a fee per hour. I asked him what he thought of the term 'fraud', and he didn't understand what I meant. I tried to explain to him that if you think a person who has no training at all, whom you let 'learn on the job' for a few months, is suddenly able to write any form of software, you're delusional.
But with the shortage of highly trained professional developers growing and growing each day, more and more people whose main qualification is that they can tell the difference between a keyboard and a mouse are hired to do 'dev work', as the client doesn't know better and is already happy someone is there to do the work.
I fear it only gets worse in the coming years. Frankly, I'm fed up with the bullshit pseudo devs who seem to pop up more and more every day, who cry 'I'm just learning, don't be so rude!' when you tell them their work pretty much doesn't cut it, while at the same time they try to keep up the charade that they're highly skilled and experienced.
Sadly, this process of pseudo-devs being seen as the true 'specialists' has been in progress for some time now, and I think it will continue for the next decade.
Let's hope they'll learn something about what 'quality' means with respect to software, that 'quality' is more than just the bits making up the software. But I'm not optimistic.
We don't have infinite resources, and we get *a lot* more value from "more" software than we do from "perfect" software. It's got nothing to do with pride or skill and everything to do with actually being useful.
Who is the most motivated to fix this bug? The guy who just found it.
Who can reproduce the bug easily? The guy who just found it.
So if the guy who just found the bug had tools to fix it while still in the app, and if the process were hassle-free, would he just fix it?
Just trying to find a solution...
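One concrete version of that idea is making the report itself nearly free to produce. A minimal sketch, with a hypothetical capture_bug_report helper, of bundling an exception and its environment into a structured report the finder could file in one click:

```python
import platform
import sys
import traceback

def capture_bug_report(exc, app_version, steps=""):
    """Bundle the exception, environment details, and the user's own
    words into one structured report, so filing it costs seconds."""
    return {
        "error": repr(exc),
        "traceback": "".join(
            traceback.format_exception(type(exc), exc, exc.__traceback__)),
        "python": sys.version.split()[0],
        "os": platform.platform(),
        "app_version": app_version,
        "steps_to_reproduce": steps,
    }

# Simulate a crash the user just hit, then capture it in place.
try:
    {}["missing-key"]
except KeyError as e:
    report = capture_bug_report(
        e, app_version="1.4.2",
        steps="Opened settings, clicked Sync twice.")

print(report["error"])
```

The point of the design is that the environment details are collected automatically; the only thing the finder has to type is the steps to reproduce, which is the one thing only they know.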
2) Management knows absolutely nothing about development. Management was promoted from Sales, or was hired from some firm which made widgets for 30 years, or was the VP's Nephew's Friend's room-mate at Good Ol' Boy U. And they don't actually have to care to produce a mediocre, bug-riddled product - they have burndown charts they can point to, and time-tracking on work items. If the burndown line reaches the bottom, everything must be great, right? Ship it!
3) 2 isn't willing to pay so much as one cent more to get a more motivated developer than in 1. Because they don't even know that the developer isn't motivated. They don't understand the code. They barely understand the app. They can't tell the difference between the worst code imaginable and the most sublime, bug-free code in existence. If it runs, it must be great, right? Ship it!
4) New Developer Fresh Out Of School joins the business, and pretty rapidly learns that there's absolutely no point to doing a better job than "just good enough," because when she produces amazing code, no one actually cares one iota more than when she produces mediocre code. If you know the code is bad, don't say anything because you'll be accused of complaining or "not being a team player." Shut up and ship it!
5) Testing - The tester can't actually read code any better than the manager, doesn't understand how to use any tools outside of the testing suite and the software under test, and isn't actually given any time to learn. They don't know how the customers use the software, so they can only test the most basic functionality. All the test systems are in VMware or Lab Manager, and are wiped and reimaged before each test (why would you ever bother to test software on a computer that has OTHER software installed on it?). If it works fine when you follow the instructions *precisely*, don't bother testing any more (you're holding up progress!) - ship it!
Those are the real obstacles. Commoditization of work. Disincentives for producing better work. Management that doesn't know anything about the business. Demotivational 'project management' that focuses on producing coloured charts instead of good software. Burning out new talent before they even have a chance to write good code. Failing to test beyond the most basic, vanilla scenarios.
That's the dream scenario for many of us. Companies don't make PDBs available. They try to obfuscate code and symbols as much as is possible. They hide or encrypt *everything*, regardless of whether or not there's a reason. They don't produce any logs or, when they DO produce logs, they're in some proprietary format that only the company's internal tools can decipher.
Microsoft makes this much easier in Windows with the public symbol server, most of the time, but when they fail to do so...
I have an issue with an IE9 security update and some other software. The issue shows up when an IE DLL is called, but there are no PDBs available for the version currently shipping - no one told the IE team to put the symbols up. Consequently, there's nothing that can be done at this point to debug or fix the issue, short of taking wild shots in the dark with code that otherwise works perfectly fine.
Sadly, none of us are. We are multinational corporations trading on the NASDAQ. We are employees who get paid an hourly rate. Someone else makes all of the important decisions.
Software is big and complicated. The only reason that people fund development is the expectation of large profits...
Yep, this won't work in the current context of the software business.
But we can dream about another context :) where every piece of software that runs on your device has its own mini IDE built in, and mini source control, and you can work on its sources as easily as on the app itself, and share your code versions with others... We have the pieces for building such a context already; could it be that we only need to put them together? (And then fight and win against the old software business models... :)
Your amazing, multi-million-line Windows desktop, the work of some 1,000 people or more, has a problem with indexing.
The network and apps that connect you via email to everyone else on the planet - free, globally and instantly - sometimes lose a mail. Or are slow to load your new messages.
A program with which you can do what once took huge teams, millions of dollars of equipment, and professional expertise (FCP) has a crashing bug in some particular action.
The program that lets you talk to everybody on the planet, instantly, with video, and paying nothing, has a badly designed UI.
Yes, I can see how "everything is broken".
Because when you didn't have any of these - when 30 years before you had a rotating dial to dial numbers on your phone, 20 years before 20MB was a huge disk in a desktop system, and 10 years before something like video chat was only possible in huge organizations with special software - everything was perfect...
"I read up to "literally off the top of my head" and face-palmed so hard that I went blind and couldn't finish the post."
YES. We have 3 literallys in that post, of which none *is* correct.
Note the "is"! There's also a "there are none" in the post.
"None" is "not one" abbreviated, thus singular. Should be "there is none"
Sorry to be a PITA but given that this is someone who expects near-perfection in software, I'd expect perfection in grammar on they're (jk) part.
Some are in Microsoft software, some in OEM apps/drivers (HTC, Nokia etc.) some in third-party apps.
Just some recent ones: very often I'm unable to enter the Marketplace app from the phone, and to "fix" this I have to restart the phone; the phone "forgets" the phone numbers for 80% of contacts after I change the SIM; no USSD or SIM toolkit support; no support for encrypted emails; Skype on WP7 does not run in the background; Lync seems unable to connect to the server; an icon appears on the lock screen telling me that I received notifications, but there is no history of the notifications; and the list continues...
First world problems....
It will only get fixed at 'fubar' (f**ked up beyond all recognition)
Filing bug reports and following up should be standardized across the industry. A public wall of shame could be a bonus.
"You are doing to much refactoring, we need delivering"
"TDD just makes you lose efective coding time"
"It is imposible to folor the SOLID principles"
They are all bullshit, we are not a Sect, we just want to write better software.
Integration is another big issue in my opinion. You may expect all your apps to behave nicely across all OSes/browsers, but in reality they're not going to be tested thoroughly with even a small sample of every conceivable configuration that millions of users are going to be using.
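The size of that configuration matrix is easy to underestimate. A quick sketch, with made-up test dimensions, of how fast the combinations multiply:

```python
from itertools import product

# Hypothetical test dimensions; a real matrix is larger on every axis.
dimensions = {
    "os":      ["Windows 7", "Windows 8", "OS X", "iOS", "Android"],
    "browser": ["IE9", "IE10", "Chrome", "Firefox", "Safari"],
    "locale":  ["en-US", "de-DE", "ja-JP"],
    "network": ["fast", "slow", "flaky"],
}

# Every full configuration is one element of the Cartesian product.
configs = list(product(*dimensions.values()))
print(len(configs))  # 5 * 5 * 3 * 3 = 225 combinations
```

Four modest axes already yield 225 configurations; add hardware models, OS patch levels, and installed third-party software, and exhaustive testing stops being feasible, which is why pairwise or risk-based sampling gets used instead.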
A. First is what many have alluded to already - we want top-notch software, essentially for free. We have become accustomed to adding substantial function to our devices, as well as cloud-based/cross-platform/"unlimited" data and services at no charge.
B. With the explosion of the mobile space, the pressure increases to innovate and push the bleeding edge out to consumers faster. Iteration cycles become shorter. There is increasing competitive pressure to get new features out the door. This is especially true in a world where all of the players and platforms intersect in the web space. To me, this has an impact on the QA cycle and upon vendors' ability to design for both forward and future compatibility.
C. "Standards" have become a moving target.
D. There are more, but it is early, and I have not yet finished my first cup of coffee.
To me, this all falls into the category I like to call the "Apollo" or "NASA" syndrome. In 1969, the US put a man on the moon. Multiple times. Following this, they developed a re-usable space shuttle program, which operated successfully (with some caveats) for thirty-plus years. The complexity of these ventures (or most other space-program undertakings) is nearly unrivaled in the history of human technology. Yet, the biggest headlines pop up when things go WRONG.
Given the complexity inherent in our modern computing and software systems, what is amazing to me is not that there are bugs and compatibility issues, it is that there are not MORE of them.
Great post Mr. Hanselman, and spot-on. Just wanted to throw a different perspective out there.
Just listened to you on the 800th podcast show, on which you mentioned:
b) Many of your complaints concern the iPhone. So why not, as you say, "stop using it"? Just like I'm going to stop using Telerik Reporting and JustCode. (Though to be fair, Telerik *do* listen.)
So what's the answer? Education for everyone concerned with building software about what craftsmanship actually means and how to do it. Yes, that means practicing the technical practices, such as paying developers to take part in code retreats, coding dojos and other types of hands-on learning events.
The .NET developer community, in particular, seems myopic in its resistance to change and process improvement. Since 'leaving the fold', I've been involved in production projects where pair programming, TDD, minimum viable product deliveries, on-site customers, etc. are a reality. Guess what? These practices work.
We don't farm with hoes and horse-drawn ploughs anymore, so why do we still build software based on archaic and out-dated practices?
Apparently, satisfaction is inversely proportional to internet use.
Life without the internet
BTW, here's what happened when I submitted the comment the first time:
An error has been encountered while processing the page. We have logged the error condition and are working to correct the problem. We apologize for any inconvenience.
Perhaps you should turn all that attention on your own stuff?
This is why *nix, x86 PCs, PHP and a bunch of other things in the IT world are so prevalent.
I agree that a small part of this problem is a complexity issue. As a developer, it's difficult for me to know what my code is doing because I don't completely understand the stack underneath my applications, and I tend to only learn more about it when I run into an issue.
I've also learned that, just like life, situations in software aren't as cut and dried as I'd like them to be. Often, I find issues to be systemic. Often, it's me.
The hard part is hitting that wall and then being willing to put forth the effort to push through it in the name of quality and that does mean not listening to the part of my brain that says it's horribly boring work.
Trying to make bug-free software is like chasing our own tails.
Surely somebody's doing something! I'm sure if you look around you'll find a lot of people doing a lot of things to fix software quality and improve user experience in software applications.
But, consider what's really broken in the world: food supply, resource depletion, pollution, poverty, crime, violence, war... When I read a title like, "Everything's broken" those are the problems that come into my head. And so, I was disappointed to read your list. It didn't aim high enough for the problems I was considering.
Makes me feel one component of what's broken is our priorities and focus. Clearly the priority and focus for the software you're using is not on quality and experience. It seems the software industry has optimized to get product out and iterate ASAP. Ship!
But then, when I consider the larger question of "what's broken?" where I look at the real issues in the world, I come to the same answer: the priority and focus of society is not tilted strongly enough towards fixing those types of big-world problems. Instead, we have so many of our great minds attacking other types of problems.
Generally, when we humans focus and prioritize, we can achieve just about anything we desire.
The iPhone 5 is a good example. Do we really NEED a phone that is thinner and lighter with a slightly better camera? Not really... but the public wants that, so Apple is giving the public what it wants. It does lead to poor quality, though, and less innovation in the software community. Without a driver toward real innovation, companies will continue to ignore the problem and just keep developing the same thing over and over in a shinier package.
1. The definition of Quality
Quality is in the eye of the beholder.
2. The 80/20 rule
Bugs/issues are (rightly or wrongly) seen as the "20%" by management; it's not worth spending the time fixing them, as the perceived gain is so small. Better to get new features out the door to gain the competitive edge.
3. "One swallow does not make a summer"
Everyone's a programmer, or everyone's a designer, or everyone's a web designer, etc. Because I have a PC and a copy of Photoshop, I'm now a designer... or I've got a DSLR, so I'm now a photographer, or I've bought some spanners and a book on plumbing, so I'm now a plumber (actually I probably am!).
I used to be in this same problem at one time. Then I stepped back, looked at the problems and took control. Now I control the systems by using them more efficiently. Any process, be it computer based or not can easily get out of control. Just like a desk stacked with papers up to the ceiling your computer can become so overloaded with crap that it appears to be broken. Time to re-examine your use of these machines and start over. Not just with one app, but with the whole mess. Throw out everything. YES! Everything. And start as if you've never used a computer before. But, this time make sure you know what you are putting where and why.
Why do some people keep having issues like these? I think the answer is pretty simple, although not very welcome to most. It's because you _don't_ use the open tech available to you. After all, none of my geeky friends have issues like these, and neither do I.
But we don't use Word. We just write, just text, and there's nothing more to it. We don't use crazy complicated indexing file managers. We don't because there are too many moving parts. Too much stuff that breaks. And we need to get stuff done. I for one can't be expected to relearn my file manager every few years.
The same kind of "issues" could be raised about the English language per se (why do we pronounce this vowel here but not there?) or about how these two plants in my window are growing differently even though they receive the same light.
I think that, mainly because of the massive scale at which consumer software is used, it has reached the kind of complexity that we see in other large systems, and we should learn how to live with it. And by "live with it" I don't mean just put up with it. I mean we as developers need to account for it, expect it, and design systems that work gracefully even in unexpected conditions. Users are learning to live with it one way or another.
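One concrete way to "account for it" is to treat failure as a normal input rather than a surprise. A minimal sketch in Python of that mindset, retrying a flaky dependency and degrading gracefully instead of assuming it always works (the function names here are hypothetical, not from any product mentioned above):

```python
import time
import random

def fetch_with_backoff(fetch, attempts=4, base_delay=0.5):
    """Call an unreliable operation, retrying with jittered exponential backoff.

    Rather than assuming the network or a dependency always works, the
    caller plans for transient failure and falls back gracefully.
    """
    for attempt in range(attempts):
        try:
            return fetch()
        except ConnectionError:
            if attempt == attempts - 1:
                return None  # degrade gracefully: caller can show cached data
            # exponential backoff with jitter avoids synchronized retry storms
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))
```

The point isn't this particular helper; it's that the failure path is designed up front instead of left to an unhandled exception dialog.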
Or do we just need to go back to usability testing before product launch in order to get rid of at least half your list?
We got used to several desktop crashes per day. My guess is that your contemporary machine hasn't needed a reboot for weeks.
We had applications which took a long time to do things which are now instant. We waited for modems, the modems often kicked us off. Our web pages loaded slowly.
You don't even know you're born!
I don't feel your pain. Not the slightest bit. Why? Because I don't run *any* of that stuff. Okay, except for Word sometimes, and that doesn't count because I run a really old version. It shouldn't come as a big surprise that if it's old it's probably more stable, and if it's new and *!FEATURE-FILLED!* it's probably immature and twitchy and doesn't play well with others.
Want less to write about? Run W2K and carry a dumb burner.
It reminds me of a problem I've run into with cross-platform calendar synchronization. I have a friend that has a birthday sometime in June. When June comes around, I'll look at the Calendar, and I'll see his birthday listed on June 12th, 13th, and 14th. Apparently he was born on 3 different but consecutive days. Somehow his birthday has spread like a virus. And I have no clue which day it *really* is, because I don't remember - that's why I put it on my calendar. And this happened regularly with a large number of friends.
My solution was to use a one-way iCal subscription instead of a 2-way sync.
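The birthday-virus failure above is essentially a missing deduplication pass: each device mints its own UID for "the same" event, so the sync engine sees three distinct records. A sketch of the kind of collapse a sync engine could do (the data shapes and function name are hypothetical, purely for illustration):

```python
from collections import defaultdict

def dedupe_birthdays(events):
    """Collapse duplicated all-day birthday events.

    events: list of (uid, title, (month, day)) tuples, as a sync engine
    might see them after several round trips. Events with the same title
    on consecutive days are treated as drifted copies of one birthday;
    the earliest date wins (an arbitrary tiebreak, since the true date is
    unrecoverable once the copies disagree). Month boundaries are ignored
    here to keep the sketch short.
    """
    by_title = defaultdict(list)
    for uid, title, date in events:
        by_title[title].append((date, uid))
    result = []
    for title, entries in by_title.items():
        entries.sort()
        kept = [entries[0]]
        last = entries[0][0]
        for date, uid in entries[1:]:
            if date[0] == last[0] and date[1] - last[1] <= 1:
                last = date  # consecutive copy of the same birthday: drop it
            else:
                kept.append((date, uid))
                last = date
        result.extend((uid, title, date) for date, uid in kept)
    return result
```

The one-way subscription works because it sidesteps the problem entirely: with a single source of truth there is nothing to reconcile.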
"All software sucks"
-- [citation needed, but at least as old as I can remember in USENET]
Personally, I think a lot of the suckage these days comes from toolkits and deep stacks. When my software has problems it's sometimes really hard to know where to even start when there are at least (quick count) 5 layers between my code and the signals on the wire (my code, toolkit API, JITter/language, VM, OS, TCP/IP stack or disk or other resource). More than likely it's my problem, of course.
But when things go wrong and I suspect it's not my problem I don't have many choices except to shift the stack a bit and find another way to do it. There's no realistic way given the constraints of time and money to do anything else. OSS doesn't help much either. Who has the time?
I'm writing software with bugs (that I own) on top of a buggy, shifting stack of software that I don't own or control.
I never sync anything. I never upgrade anything. I never allow any app of any kind to notify me about anything in any way, and I avoid chat software like the plague.
I act on the assumption that most programmers and product managers are no good at anything, and consequently my day-to-day experience is remarkably serene, by contrast.
It's interesting how many comments have some element of "blame the victim" (e.g., you're using too many products, you'd be better off buying Chevrolet gasoline to go with that fancy Bel Air).
It's also interesting how many comments here focus on some specific problem (e.g., setting up an alternate account with one-way frobozzes will resynthesize the index deletions), while missing the big picture that these are symptoms. It's all bad.
As a profession, we can do better. We know how to do better. We've been taught how to do better since the 1970s, when modularity and data hiding really came into their own. But, alas, we're in a hurry, and doing better requires hard work: thinking. The problem is not that it's hard to enumerate a zillion test cases--that wouldn't be as much of an issue if we focused on getting things right in the first place, on designing for isolation and independence. Heck, maybe focus on designing at all, rather than on getting the latest greatest tiny update out as fast as possible.
But it's been clearly demonstrated that what the market wants is crap in a hurry, and that's what it gets. The problem is exacerbated by the purveyors' need to deliver constant "improvements" to existing functional products in order to garner more revenue, which in turn requires grossly incompatible changes with great frequency just to wean satisfied users away from working solutions and force them to adopt more expensive new (but hardly improved) technologies.
For more on this, read anything by Don Norman, or Why Software Sucks...and What You Can Do About It by David Platt.
You've received 80+ responses to your post on the very same day, the first coming minutes afterwards. The ability to create software, and indeed to complain about its quality online is the highest form of individual empowerment and communication capability we've ever seen. I don't disagree with your complaints, and I too believe that we can do better. But look how far we've come in just that last 20 - 30 years. It's just growing pains, and it happens with every new, significant technological advance. Having said all that, I believe that "the collective will to fix it" can be characterized in a single word: craftsmanship. We need more of that in our software.
If the answer is "not much" then we should move on by realizing what that means: users simply don't care enough about these paper cuts.
1. Silly non-printable character on the please wait popup when opening a WinForms form in the designer.
2. Errors saving WinForms where the only solution is to close VS.net, clear the temp directory, and then restart. Really messes with DevExpress.
3. ASP.net sites claiming compile errors because of temp directory crud not getting updated by VS.net when a change is made to a project that the website depends upon. Requires a close of VS.net and a kill of the temp folder.
4. If you have a ton of errors in the same file (common if you're refactoring by hand) and you start at the top of the errors list and ever delete a line of code, all others in the errors list will be off by one line. It doesn't automatically update.
5. VS.net 2012 routinely fails with IntelliSense. The only solution is to close the file and reopen it. Minor but annoying, and new in VS.net 2012.
6. The Package Manager Console project drop-down in VS.net 2012 is always blank, so you can't pick a project to do things like EF Update-Database etc. Have to hack the manual commands. Yuk.
7. Windows 8 RTM: if you do a lot of copying and pasting of files (over and over again), especially with drag and drop, Windows Explorer crashes without error. Doesn't kill the Start menu, interestingly.
8. Windows 8 doesn't let me have multiple metro apps up on multiple screens. Yuk. This one thing would have made the OS OK to use.
9. Windows 8 loss of the Start menu. Should have been replaced with a Windows Phone 7 style small, vertically scrolling version of the main start screen. PITA.
10. Windows 8 native apps have screwy mouse support that didn't adapt and it's a shame. It should have been changed to work like a tablet and scroll by grabbing (click and hold) and highlighting and drag and drop should have been changed to click and hold longer like WP7. Then panorama and everything else would have worked great just like touch and people using a mouse wouldn't have hated it. (also works on a touch pad too)
11. Chrome routinely freaks out loading a google result that you click on and shows the previous page loaded instead of the new one.
12. Ever since google started redirecting through themselves on links instead of going directly to the link clicking through results is slow as hell. Bing is better but not great.
13. Microsoft please release a complete bluetooth stack that works with all bluetooth dongles and not just your own and has all profiles for all devices. The ones from the manufacturers SUCK.
14. Seriously you can't boot Windows Media Center on start up in Windows 8? Seriously?
15. Seriously you're not replacing Windows Media Center with a Windows 8 native set of apps? (see below)
My #1 biggest peeve:
Microsoft: Release a box that takes a CableCARD and has power, coax, USB 3 and CAT6 plugs on the back. Work with the cable companies to automate CableCARD pairing and activation. Make it run Windows Embedded, automatically detect new hard drives plugged into the USB port, and add them to the drive pool for recording. Make it do nothing but handle the schedule and record shows. Make it seriously cheap.

Then provide an open interface that anyone can use to communicate with it and stream video, but create a consistent interface on Xbox 360 and Windows 8 in Metro style. Take over the TV world by doing this before Apple does. Don't try to create your own cable company; that's a waste of time for now. CableCARD gives you the solution: a box that just works, that the Xbox 360 or Windows 8 can use and control with a guide, that any device can play anything from, and that can record/live-stream 6 shows at once (the max CableCARD supports), and you're done. Ultimately work out deals with Dish and DirecTV to plug in as well with an adapter.

I know the WMC group is disbanded, but this is how you own it. Why are you not doing this? It's the logical next step for WMC and would hit a HUGE market fast, especially with iOS, Android and WP clients that work in and out of the home. Head shake as to why this isn't happening yesterday? You should have released this 2 years ago or more, when you brought out the Xbox dashboard with the Metro design language.

DO NOT PUT THE RECORDING IN THE XBOX. Let multiple Xboxes work as set-top boxes. Work with the TV manufacturers to license access to the boxes. Let other companies connect and create their own interfaces. Google wouldn't be able to compete, and neither would Apple, if you do this right; it would assure that the Xbox 720 owns the console market too, because Sony would be behind the 8-ball, and if you patent it properly you could block everyone out.
Are folks just not passionate about their software enough to fix it? - No
I am a passionate developer, but I am not a genius.
I seek opportunities to learn from passionate geniuses, but my unfortunate experience is that geniuses don't get into details; they cost you time and money and create some issues, and then they tell you that your system sucks and leave.
On the other hand, I've been running a debian Linux 2.6.18-6-k7 for more than 714 days, without interruption. While I encounter no bugs, I have no reason to upgrade it (and then I actually did upgrade debian to a whole new version to install new software last year, without having to restart it!).
When there are bugs in commercial software, programmers would need time to find and correct them. Therefore they'd need to be provided food, clothing, and shelter, for them and their families. This would translate, in commercial enterprises, to money to be paid while no new software would be sold, which would translate to a loss, bad quarterly results, falling share price, angry shareholders, bad "economic statistics", bad GDP, pessimism, enterprises not hiring, unemployment. A lot of sad people.
On the other hand, if instead the corporation just increases the version number and starts selling the buggy software, there's no expense, there's sales income, therefore profit, therefore good quarterly results, increasing share price, happy shareholders, good "economic statistics", good GDP, optimism, enterprises hiring, people getting hired. Everybody's happy.
On the other hand, software that's not developed in a commercial environment, e.g. GNU Emacs, is delivered when it's completed. There's no deadline for when a new version is released: the next version of GNU Emacs is released when it's ready. The result is that while Emacs is the application that I use the most (I always have it running), it's even more stable than the underlying Linux system (which has to be rebooted when upgrading drivers). On the above-mentioned system, I have Emacs instances that have been running for more than a year.
There's also another consideration. Operating system research has practically stopped since the late eighties. The fact that commercial corporations standardized on the IBM PC and then Microsoft Windows killed all the effervescent competition there was between various computer architectures and diverse operating systems, both in the commercial offerings and in academic research. See for example http://herpolhode.com/rob/utah2000.pdf
There are a few researchers who try to develop new OS concepts. For example, Jonathan S. Shapiro was working on capability-based OSes (eros-os and then coyotos), but he was stopped in his tracks by being hired by Microsoft. Again, one has to find food, clothing and shelter for oneself and one's family, and in the current system, that means the commercial corporate world.
There is a very strong herd spirit in humanity, so it's also hard to expect much. Yes, there may be bad systems, but as long as 85% of the people are using them, they keep using them. Sometimes for the good "network effects", sometimes for economies of scale (though nowadays we have the means to produce more personalised products, so there's no strong reason to have billions of identical phones on the market), but more often just because the rest of the herd is doing the same.
Take your calendar birthday issue for example. What do you do with that feedback? Who owns that experience? I have an issue with the Videos app in Windows 8. I bought three seasons of "Avatar, the last Airbender" years ago and now every individual episode shows up there in a flat list. Worse still, none of them actually work. Who do I talk to about that problem? I have no idea.
Experiences like that seem to come from a soulless experience factory manned by mindless automatons interested only in parting you from your money. When we're able to put a name (and a blog, twitter account, etc.) to a user experience, then I'll think we'll see some real progress.
I think right now we need so many developers that we will put up with crappy software. Good enough has become the status quo.
I have to constantly debug web apps for users (doctors, MDs) trying to use large corporate apps to collect information to help sick people get better. Sometimes I give up and that means "XYZ's" app really sucks.
I hope the industry is working through all this right now and we are in the middle of the change... it feels this way to me. Change for developers and change for management to accept what is possible.
Technology for developers and technology for consumers are totally different beasts.
Which is why something seemingly cool, when unleashed upon the general masses, eventually grows complicated enough that it starts buckling under its own weight.
Also, the usage patterns go wild when the technology is in the hands of the consumers, some of which are not even thought of by the makers of the device / software applications.
Instant gratification and "don't make me think" is part of the problem as well IMHO.
When we purchase a car, we are aware that it behaves in a certain way, there are rules of the road and there is maintenance to be done to keep the car running.
Technology has no such boundaries, it can do whatever we can build it to.
Plus software not being a physical entity makes it much more complex to comprehend.
I feel these are growing pains in a very young industry and it will take some time before it matures.
"Wisely, and slow. They stumble that run fast." - William Shakespeare.
I think all of those skirt about the real issue. I believe it is about accountability, which wraps in all of those arguments.
Software doesn't *need* to work, and we agree to that every time we click the "I agree to these terms" box. There is no cost to turning out a failure-prone product so long as it is (to a sufficiently large audience) in some way more desirable than the alternatives, or it at least sells enough to pay back the development costs.
If Apple had to pay for your time spent fixing your iCloud Photo stream, Microsoft had to pay for the cycles wasted while the indexing service churned away at nothing, or Google had to reimburse you for lost business when Chrome made you bust a deadline because it screwed up Visual Studio... Those companies would either get out of the business or tighten up their code and make sure it played well with everything they could possibly test. QA budgets would skyrocket, and so would the cost of software.
I don't see any way to impose accountability, though, and I'm pretty sure that if we did, innovation would come to a crashing halt and then restart at only a glacial pace.
So, aside from perhaps being an interesting observation, I suppose none of that is very useful.
First off, throw your iGarbage products where they're supposed to go and get a Windows Phone 8. =P
I totally agree with your post, but we can't forget the factor that drives our lives in software development and engineering: "Meet the deadline at any cost, no matter what," or find yourself replaced by the next dude who wants to try.
Nobody cares about QA as much as they should; scopes are constantly changed while deadlines stay constantly static.
One of the last things he had published was an essay called "Nothing Works and Nobody Cares". This was in 1965.
So "everything's broken and nobody's upset" has been going on for at least 40 years now. It wouldn't shock me to find complaints from the Roman legions that the quality of swords and spears has been declining, and now you can't get through an entire campaign without using three or more swords, where in the old days you only needed one.
I'm not saying we shouldn't do anything about this (and I have my own long list of tools that don't work or are broken) but I think it might be worth taking the long view on this.
I work in software testing. Stories like this drive me more to push quality upstream and hold the line politically when errors are bad.
But yeah, when timelines are shorter (remember when huge software products took 5 years or more to get right?), quality requirements are lower (how many users will really do THAT?), and upper management now has less of a real connection to the software (since somehow now everyone wants to run a business rather than solve simple life problems).
That said, these instances all make me want to face-palm, then go into work and spend a little extra time selfhosting and working on integration testing. Because the last bit that I didn't read above (sorry, too many comments) is I believe that when we are all building more complex software, test teams (if you HAVE a test team; *cough* Facebook) spend more and more time focused on the complexity of their features and less on the experience of using the product.
If software engineers on the whole recognized the priority of software quality, then maybe the role of test engineer would be more common. Today, it's just not.
It WILL get worse before it gets better though. Most engineers just don't get it. They use all the workarounds you mention as a part of daily life, then go back to work and keep ignoring the pain. Not until regular users just literally can't use the product anymore and don't buy it will this pattern change.
I feel your pain. Every day is a similar list for me, and it drives me to do better at testing software before others get to it.
You didn't give much coverage to the whole security aspect of things (apart from AdBlocker)
For me, this is one of the most broken aspects of computing (at least on the Windows platform), with just about every ounce of grunt that my quad-core PC may have had being taken up by virus checkers & firewalls etc.
Whilst the concept may not exactly be broken, the implementations certainly are. Just as we are now consigned to spend inordinate amounts of time in airports due to terrorist threats, we are also doomed to never realize the full potential of computing performance improvements.
In general, though, the 'More haste, less speed' philosophy seems to have crept into most software products.
Just how many times a week does Adobe Flash get upgraded, for heaven's sake? (Did anybody even notice the issues that the patches are addressing?)
Also, you're probably just getting old(er) - just like me ;)
I have five kids and at least 6 platforms to deal with in my house. My head is about ready to explode because I am the ONLY IT guy. I told one the other day that his grandfather died at 67 and his great grandfather died at 62 and I'm 58. Exactly how much time do I have to spend troubleshooting his print server?
The barriers to entry in this field are effectively zero.
The ability for the consuming public to ascertain expertise prior to engaging a product is almost zero.
The liability for false claims about a software product or service is effectively zero.
Developers can jump in, produce total crapware, cover their costs and move on, all while updating their resume.
What it comes down to in this field - like every other one - is the personal commitment of the developers and companies involved to have no tolerance for mediocrity and to stand behind their products. You either give a damn or you don't. I had a service that processed electronic health insurance claims. 20,000 a night for 10 years. We NEVER lost a single claim because we were nuts about fault tolerance. Why? We used to say "it is someone's paycheck". He, his family, his employees and his patients all depended on us to make sure everything worked. It was hard and it sure as hell was satisfying.
Of course, this will all be moot as the patent wars escalate. Pretty soon I won't be able to pinch my wife's ass because it'll violate some Apple gesture patent.
I'm going to yoga now...
Given the Time-Money-Quality triangle, quality is the first to go. High quality software requires money and time. You need to hire project managers, testers, designers, tech-writers, etc. Companies that just lean on their developers to get as much done as they can are clearly sacrificing quality in favor of reducing head-count, salaries, and time to market. I think the small companies and large companies alike are guilty of this.
That said - why is it that hardware engineering doesn't seem to have this issue? That shiny new iPhone took plenty of competent engineers, but also lots of overhead from project managers, designers, quality control specialists, and much more. Can you imagine if Jobs had just found another Woz and asked him to build an iPhone?
One wonders if the rise of 3D printing, self-fabrication, hobby electronics, etc. will end up corrupting the hardware industry. When hardware engineers stop designing stuff and start just throwing things together because their boss asked them to - they will be exactly where we are today with software.
Also agree that we can do better.
Same with David Kennedy's comments.
I am amazed at how greatly IT incompetence and workforce apathy rule in work environments nowadays. It seems that IT "solved" its problems by giving VIP treatment to its leaders - so they don't feel the pain in everyone's asses - and the rest of us receive the "left-alone-in-the-cold-night-public-service-like-sorry-I-cannot-help-you" treatment.
But there's something else. A root cause of many bugs is the fallacy that the code behaves like the concepts we have in our head. You see a "person" in the code and you assume that it somehow behaves like a real person, when maybe it's just a first/last name pair that is not good enough to uniquely identify a person. And you end up with contact sync bugs or instant messages going to the wrong window.
Concept programming challenges this core tenet, by putting the focus on the conversion from concept to code, which is always lossy. It gives us a few metrics to identify what might go wrong. With the help of concept programming tool chest, you will quickly realize just how broken something as simple as your "max" function is. By "broken", I really mean that it has very little in common with the mathematical "max". So any reasoning by analogy with what you know from concept space leads you to ignore issues that you should consider (e.g. overflows in integers).
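The gap between a code "max" and the mathematical max is easy to demonstrate. The comment above names integer overflow, which bites in fixed-width languages like C; in Python, where integers don't overflow, the same concept-to-code loss shows up with floating-point NaN, where `max` silently loses commutativity, a property the mathematical max always has:

```python
import math

nan = float('nan')

# Mathematical max is commutative: max(a, b) == max(b, a).
# Python's max compares with '>', and every comparison involving NaN
# is False, so the result depends on argument order.
a = max(nan, 1.0)   # 1.0 > nan is False, so the first argument survives
b = max(1.0, nan)   # nan > 1.0 is also False, so 1.0 survives

print(math.isnan(a))  # True
print(b)              # 1.0
```

Reasoning by analogy with the mathematical concept, you would never expect `max(x, y) != max(y, x)`; that is exactly the lossy conversion the presentation is talking about.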
The linked presentation also offers a few ideas on how to build tools that take advantage of these observations to reduce the risk of error.
I use an MBP running Windows 7, WP 7.5, and Firefox as my primary browser. Other than some stupidity with Trillian, my tech issues are non-existent.
also, eliminate monarchies. after microsoft failed to be a benevolent monarch in the 90s, people simply looked for a new king - apple. no more kings. use open source and dive into problems or help those who are diving in. if you rely on the goodwill of a benevolent monarch, it's game over. you probably cannot get out of your situation with apple technology.
keep expectations in check. technology is often poisonous to our happiness. limit the penetration of technology into your life.
I definitely don't think you're whining. I've been writing about the decline for years:
What seems to happen is that the problems are really easy to ignore when you're new to software. But after a while you start getting expectations about what quality really means, and then you start noticing less and less of it out there. Some days it makes me want to become a Luddite.
Two quick observations:
1. Your general pain exemplifies why it takes me quite a while to incorporate something new into my technical ecosystem. Before I commit to using one of these wonderful services, I want to largely know how it integrates, what its limitations are, and ensure that it is easy to live without if something goes away or fubar.
2. Here is another instance of the problem: I have tried more than a couple of times to use my Blogspot/Blogger information to further identify my comments, never gotten it to work, and comments I spent five minutes or more typing in get completely lost. Happened with a previous version of this comment too.
1. Developers are discouraged from considering their work art or craftsmanship. Emotional detachment is critical, I agree, but nearly every development methodology siphons passion out of coding in an extremely effective manner. And when developers stop fighting for code elegance, quality spirals quickly. I've never seen high quality code produced when the developers weren't willing to fight management tooth and nail over feature bloat and quirk maintenance. Everything is a tradeoff, but each non-critical checkbox that is introduced generally doubles the number of logic state permutations, and halves the ability to perform complete QA.
2. Leaky abstractions are considered 'acceptable' by management. Software capabilities are limited by how tall we can stack the layers. Perfect layers can be stacked infinitely, but imperfections trickle up exponentially. Software innovation has slowed to a trickle, in a nearly logarithmic fashion, and it's the fault of layer quality just as much as the patent wars.
3. Money is one of the least effective ways to encourage creativity and craftsmanship (see "Effective Programming: More than writing code" by Jeff Atwood, great read).
To me, the only strategy for solving the layer and quality problem on a large scale would involve nearly removing the monetary reward and management interference factor. And to be truthful, this strategy wouldn't work on a small scale, either. Only large companies with very diverse software needs would reap significant direct benefit over a traditional management approach. Finding developers who craft excellent libraries and beautiful layers is possible, but dictating what they want to work in is not - you can inspire, herd, and motivate interest/passion, but you cannot dictate it. However, on a large enough scale, (such as at Microsoft, Apple, HP, IBM, etc), it's unlikely any creation of very high quality won't find its own utility somewhere.
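The "each checkbox doubles the permutations" arithmetic in point 1 is literal: n independent boolean options produce 2**n distinct configurations to test. A quick illustration (the function is just for this example, not any real test tool):

```python
from itertools import product

def configurations(n_checkboxes):
    """Enumerate every on/off combination of n independent options."""
    return list(product([False, True], repeat=n_checkboxes))

# 5 non-critical checkboxes already mean 32 paths through the settings;
# 20 of them mean over a million, far beyond any complete QA pass.
print(len(configurations(5)))   # 32
print(2 ** 20)                  # 1048576
```

This is why each "harmless" extra option quietly halves the fraction of the state space that any fixed QA budget can cover.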
1. Locate a large number of software craftsmen and craftswomen; people driven by the desire for perfection, who create elegant and artful solutions to complex problems, and have a good bit of altruism. How do you locate these people, you ask? By browsing GitHub, or looking within your own ranks for dissatisfied, despairing, yet accomplished perfectionists.
2. Calculate the cost of living for each person (and their family), and pay them no less and not significantly more. Eliminate the monetary reward factor. Re-evaluate periodically to adjust for family changes, medical problems, etc. Eliminating monetary stress is just as important. I know this is impossible to do perfectly, but it shouldn't be very difficult to improve on the existing situation. Providing accounting and budgeting services to the developers is an easy way to monitor and manage things. You don't want to make money a carrot *or* a stick; helping them get by with less money is not necessarily a disservice.
3. Promise that any patents derived from their code will never be used for offensive purposes, and will never be sold. License every line of code they create under an OSI-approved license, and have an accessible staff of lawyers in case ambiguities arise.
4. Decouple management as much as possible, with one 'handler' per 8 to 20 agents. And use your best managers, people who are rockstar coders and were born with enough people skills to charm a gargoyle. These people are already good at self-management, or they wouldn't be creating high-quality work on their own. But looking 3-10 years ahead, determining what layers will be needed, and inspiring them to work on *those* projects is not a task you should assign to any but your best.
5. Evaluate projects every two weeks, to help keep them on track, or suggest a deviation if they need a break. Handlers serve more as counselors than managers.
6. Evaluate individual suitability for the program every 18 months, or sooner if requested by the individual. Provide an easy path into and out of the program; this will increase employee retention rates and allow developers with good foresight to save the company's collective behind occasionally, and also permit agents with flagging interest to return to the 'standard business environment' without repercussion.
7. Encourage collaboration between agents, but do not require it. Require good code readability, good documentation, and well-focused unit tests, such that a project can be picked up by another developer within 3 weeks. Allow agents to act as coordinators if their projects achieve sufficient popularity and the interest of other top-tier devs.
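The "well-focused unit tests" requirement above is worth illustrating: a minimal sketch, where the helper function and all names are hypothetical, but the shape - one behavior per test, with the intent in the test name - is the point.

```python
import unittest

def normalize_name(name: str) -> str:
    """Collapse whitespace and title-case a contact name (hypothetical helper)."""
    return " ".join(name.split()).title()

class NormalizeNameTest(unittest.TestCase):
    # One behavior per test: a new developer can read the intent at a glance.
    def test_collapses_internal_whitespace(self):
        self.assertEqual(normalize_name("ada   lovelace"), "Ada Lovelace")

    def test_strips_leading_and_trailing_space(self):
        self.assertEqual(normalize_name("  grace hopper "), "Grace Hopper")
```

Run with `python -m unittest`. Tests like these double as documentation, which is exactly what lets someone else pick the project up inside 3 weeks.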
This is not a small-scale or short-term strategy, nor one that can be employed on people that aren't already in the top-tier. However, I suspect it will attract a *lot* of top-tier talent that would otherwise be inaccessible to a large enterprise. And I think it would eliminate the talent loss that seems to be occurring everywhere.
Many of the best developers are driven by desire for immortality - they want to write code so elegant, so reusable that it never dies; layers so perfect they are never replaced. Find a way to channel that desire, make that code a possibility, and you can solve a lot of really hard problems with a very small amount of cash and a few high-quality handlers.
A lot of people like the birthday calendar feature. If you don't, it's easy to turn off:
Settings -> Options -> turn the "Birthday Calendar" switch off.
Some others (Outlook mails stuck in your outbox, indexer working overtime) sound like known issues with running a certain *pre-release* version of Office.
The life cycle of products and companies is drastically shortening. There is less incentive to polish products and more incentive to keep releasing new ones.
Companies get wildly successful or die in 3-5 years. iPhone was released 5 years ago and turned Apple into a behemoth. It doesn't matter if every feature in Nokia phones worked perfectly. Apple evolved and Nokia didn't.
Society doesn't demand better software. It constantly demands new things, at the price of reduced quality.
It's not a development problem. No one cares, so everything is broken.
Great for merging contacts and removing duplicates. ContactClean
Helps with organizing contacts, groups, merging from Facebook and Twitter and much more. Really a contacts replacement app. ContactsXL
i use two anti-virus programs: MSE and AVIDS (from BELL Canada, a.k.a. Sympatico) from time to time both find the same virus (giving it different names) ... it never goes away ... i'm sure i'd be told i'm guilty for my own re-infections but i'm guessing i'm not -- the virus is NOT today's latest ... it's several years old.
Microsoft Outlook 2010 ... my .pst file i've named "iHateMicrosoft" which says it all ... i was perfectly happy with Outlook Express a.k.a. msimn ... at least for me, Outlook 2010 usually closes and opens cleanly (most of the time). really, the file extension should be .pis for personal information store and also because one almost certainly ends up .pis-toff at it.
i'll quit now because i do not want my comment to be longer than your post ...
FWIW, i too feel your pain ... we are not alone! B-(
For example: Apple bootcamp + Windows 7 x64 + Lion: In June '12 I upgraded my MBP to OSX Lion and it silently wrote a new recovery partition over the first several hundred MBs of my Windows partition. I won't soon forget sitting down with an Apple filesystem engineer at WWDC and hexdumping the top of my Windows partition to see it no longer started with the NTFS signature! I later found that this catastrophic data loss bug had been reported many times in Apple Support forums for almost a year before Apple's latest OSX installer wiped out my filesystem. But in that interval Apple apparently did not bother to fix it, did not even deign to add a bootcamp check or warning to their installer. This was not an esoteric scenario, nor did Apple lack resources to catch it in testing or fix it promptly after the first reports. Rather, their inaction reflects their priorities here.
In such cases, Carl's idea for a central public wall of shame has merit.
For example, you can buy a new computer that will routinely f-up, or for the same $2K, buy an old but highly functional used car.
Not only is there no doubt which one will have fewer problems, the problems the used car has will not be catastrophic. They'll just cost money.
On the other hand, a misbehaving computer can trash your disk, overload your network, become a zombie in a botnet, overload peripherals, etc...
As to what the problem is, I'm reminded of a saying I've heard many times in the software development process:
There are four key factors in software development: budget, features, deadline, and quality.
It is an inevitable fact of nature that management can only choose three of the four.
Nearly every software product I've been involved with has had management choosing the first three factors because budget, features, and deadline are critical to a business's success and easily measured. Software quality is hard to measure.
The more management focuses on budget, features, and deadlines, the less time there is for testing. This reduces the number of known bugs and, ironically, gives the appearance of better quality.
Focus on the first three also forces engineering to skimp on testing, ship products with serious, well-known bugs, delay rewriting and refactoring of code to make it more stable, etc...
As a comparison, my first "real-world" software position was a summer internship for Grumman working on revamping the F-14. More than half of the development time, before a single line of code was written, involved designing requirements, specifying tests to meet those requirements, and writing those tests. They even put an actual cockpit of an F-14, along with a giant computer system to simulate flight, into the testing lab.
Suggesting this level of testing to a software manufacturer would surely evoke laughter. In the computing business, extensive testing seems to be replaced by a mad rush to the finish - sometimes even requiring the movement of QA to development to meet deadlines.
While having an exclusive contract and being paid cost+ leads to waste and even criminal activities, it does have its advantages.
It had some issues early on. We wanted rich client-side interactions that a pure thin-client browser could not deliver. We wanted asynchronous requests with lazy responses that http simply could not deliver at the time. But I think when the going got tough, we abandoned our principles in favor of features and quick solutions.
I think we need to start thinking about this whole n-tier problem again from the ground up, because what we have now is not going to work for complex, enterprise class applications over time. Most of all, I think we need a true object compiler for the browser and design, coding, and testing paradigms that cross the server to client boundary seamlessly, so that we can enforce software design quality from end to end.
There are many different areas to look at for improving future work. It's easy to get stuck on the first one, which is the developers themselves. Sure, if they wrote perfect code everything would be perfect, but assuming the fix is simply "better developers" is unreasonable - that's an even harder problem than "better software." It needlessly simplifies everything and ignores root causes while providing no real solution.
What we need to ask is how we could encourage a better average competency among developers, and how we can remove more of the impediments that require developers to be not just good but superhuman.
Training is always an idea; more might help, but there's already enough evidence to show it can't single-handedly solve much.
Being more selective is worth asking about. Do we really need as many developers as we have? Or would we be better off removing or redistributing people, so that good developers aren't cleaning up the messes of green or nearly incompetent ones?
Oh sure, that might help, but who would do this selection? Managers would be the obvious answer, but we seem to have an epidemic failure among software management of the ability to evaluate performance of developers.
The few really good facts we have on what allows developers to perform at their best are ignored not just occasionally, but almost entirely. Arbitrary deadlines are the norm, despite evidence that they always produce lower quality results, and often take longer too.
Fiat design, handed down from management, is the norm. This despite the obvious knowledge that few people will be both expert in design and in managing people, thus leading to a predictable failure in one or both of these inappropriately married roles.
And last but not least, the mixed messages provided to developers on the importance of quality in all aspects. You're very unlikely to have a quality product if you don't care about the quality of your code, the completeness of your tests, the health and well-being of your developers, or your communication practices. Yet over and over, messages such as "schedule at all costs" are transmitted. Even when a message such as quality is simultaneously stated, the damage is done. At best, after some confusion, the schedule message is ignored. At worst a team vacillates between one side and the other, constantly screwing up their code, only to then lament their inability to fix it (and spend a lot of time talking about what's wrong and how they can't fix it, while never actually fixing anything).
It's certainly a pessimistic view on my part, and I honestly hope that I'm wrong and that there is a way out of this mess.
I love this comment and will be sharing it often. Time to get developers in gear and start caring about software quality.
And then there is the "testing is for wimps" retort I heard to chuckles the other night in a user group. Testing in general is still rarer than you might think, and way down the list of things to throw money at. It's all about the money these days. I'm not sure when that happened, but I think it started around 2008.
It's rare that I use a piece of software and don't find an obvious bug in 5-10 minutes that should have been a show stopper.
I compare this to my hobby: cycling. I have extreme confidence in my tools. I've learned to tune my machine to prevent failure. When something does go wrong, I can fix it without taking it back to the manufacturer. Often, I find a problem and learn how to fix it, thus preventing the problem from happening again in the future.
Software is not like this. The user usually cannot fix problems in the software (even in Open Source software). Tools change so frequently that new skills must constantly be acquired and then lost to make room for different ones. I'm in online education so I profit from this, but it still frustrates me.
The most frustrating thing to me is perfectly good features that are removed from software. In Open Source software, there's a need to release new versions that are different from the previous ones, even when there's no noticeable benefit or functional improvement.
I want software that works like my bike. Serviceable, reliable, consistent.
Secondly, regarding the issue at hand, I think that if you boil this problem down, it comes from the fact that software is an incredibly complex system.
BTW, just this morning somebody mentioned to me the cliche about "this isn't rocket science" and that got me to thinking that maybe software development today is as complex as rocket science!
Think about all the software stuff we carry around in our heads and look at all the software books on our shelves (as well as our guilt piles) and really consider that maybe people should modernize the cliche and start saying, "it's not like it's computer science!"
I fear for my livelihood in such a future. :/
Example: Windows 8. I have not met a person who has used it on the desktop who liked it. How are invisible spaces on the desktop that you have to hover over good for usability? Why do I have to hover over the lower left hand part of the screen (and then wait) to see a start tile... why can't they just put the button there? Why do I have to hover over the invisible part of the lower right hand screen to get the charms menu to come up (if my main monitor is on the left, I frequently slide off of it onto the second monitor)? This may be good for a tablet (I love my WP7 phone) but it's horrible on the desktop. Why does alt-tab only show the desktop and not all my open apps (if this isn't proof that the desktop is a second class citizen, I don't know what is)? At a minimum, it should be customizable to a metro vs. desktop view (and I'm not buying usability arguments from MS because I commonly have to go 2 or 3 clicks deeper and out of context to get to common tasks I need).
In summary, Windows failed on the tablet for 10 years because they tried to cram a desktop OS onto a tablet and it was hard to use. They didn't learn their lesson. Now, they're trying to cram a tablet interface onto the desktop. Tablets are all the rage, but seriously, you lose your desktop market and you will be hurting.
- The ASP.NET Development Server with Visual Studio 2010 throws "Out of Memory Exceptions" after 10 to 15 minutes of use on any project (don't have this problem with 2008).
- Outlook stops refreshing mail, requires you to manually enter your password which also doesn't work. You have to close it (and Lync) to get your mail to refresh.
- My source repository (vault) has become slow to the point of being unusable... the recommended solution is to start from scratch and check all my code back in (thus losing the history).
- SQL Express service fails to load on my desktop every couple boots. I have to go into the services and manually start it.
Sounds like you have a lot of Apple-related issues. Maybe it's time to come back to Windows? ;-)
Then provide more examples of things breaking down without mentioning any fixes, and without any comparisons to competing products (which also break down). Then blame software developers.
The result? Lots and lots of comments, and your blog shows up in Techmeme. (Including comments about how we should all be rewriting source code, or switching to Windows. Give me a break.)
I just read Battelle's rant and it is so close to this one, I'm beginning to wonder about this trend. Maybe if enough bloggers do this we'll all start to ignore them and they won't end up on Techmeme.
How about publishing an article about how to fix some of these edge cases?
The problem will only get worse. Gone are the days when one could master C and be a productive contributor for the foreseeable future. The software landscape is now a roiling morass of frameworks and protocols and half-interoperable languages, all with unpredictable life cycles. No one can know enough, and Jacks of All Trades are still Masters of None.
In the same way that casual development results in log files that cannot be automatically processed, so too casual meta-development results in frameworks, protocols, and languages that cannot be automatically analyzed in conjunction, much less in isolation. In a slower world, analysis tools would be scraping the cruft and automating the menial, the weights that make even the smallest software task feel like heavy lifting. And they would be training us in the process to boot.
But they can't much now and they won't much later. Because the problem of software quality is not a matter of complexity, or attention to detail, or the fact that we can't even model heterogeneous systems—beyond their source—to say nothing of analyzing them for "obvious" inconsistencies and vulnerabilities. (In the time it takes to write a useful analysis, the shifting landscape has marked its obsolescence.)
No, the problem is a matter of difficult and unsexy tasks, and what we'd rather be doing. And until that changes, we are left with the status quo: that working software is amazing (and nobody is happy).
That includes focusing too much on schedules and staffing projects with masses of incompetency and allowing no time and providing no motivation to develop competency.
Here is an example: I currently have a problem with my VS 2010 not being able to run a specific asp.net app with the VS dev web server (built-in webserver) - it just gets stuck at waiting for a response from localhost when IE opens. You look at this example and ask: what are the dependencies? Framework versions, VS configuration, service packs (OS and VS), environment, MS hotfixes (KB) on the machine, VS plugins, etc. You check on the web and see some folks having a similar issue - though maybe not quite the same - and try their remedies, and they don't work. Some folks' questions go unanswered and they end up rebuilding a machine. BTW, in my case it is not a problem with the app, because other devs can open the same solution on their machines and it runs fine. This is just one issue, but as you dig deeper, it becomes a wormhole ready to suck you in totally.
Well, the problem is you are giving your $$$ to multiple companies; nobody is happy and everybody wants more.
I'm surprised to know that you are allowed to USE Google Chrome, even more surprised that you can even BLOG about it.
The quality of a product depends on the QA team. Developers rely on them to point out their mistakes. So the blame squarely lies on the QA teams.
I heard that Steve Jobs was involved with every aspect of the iPhone. It sounds to me like he was the QA person for iPhone. If we do not like to hear that Steve Jobs was doing QA for iPhone, then we have a problem. QA needs more respect.
I also have a general point to make - we have to stop giving too much respect/importance to the 'process' i.e. software engineering, and start giving more respect/importance to 'people' i.e. developers, QA teams etc. You need great 'people' to build great software. A great 'process' can only be an add-on.
I also sometimes find it amusing that everyone has accepted the fact that the foremost quality that code should have is that it needs to be 'maintainable' i.e. it needs to be written in such a way that, a new developer can look at the code and understand what is going on. That is HR-attrition-economics stuff. Some emphasis should be given to 'maintainability', but we seem to have gone overboard with it. We have to note that, 'maintainability' has nothing to do with 'usability', 'performance', 'design', of an application.
Once the industry has been overturned and settled down, reassess the situation. Then, er, only blonde hair and blue eyes ...
I can't really add to this, though I will say it's funny to me to read someone actually list their problems with an iDevice. So often people fawn over the things to a level where I wonder why I have an Android phone instead of an iPhone, only to find out it has the same kinds of quirks.
I do remember one head-scratcher a few years ago with OS X, where the home user folder would just keep growing over time even if the user didn't save anything. It was enough years ago that it was a problem if something was taking up an extra 2GB on a desktop machine. It turned out the culprit was Mail.app, and the way it indexed emails; I don't know about today, but a few years ago they used SQLite. It's a nice library - for those who don't know, it's a SQL implementation meant to be used as a library, and saves databases to a file - but Apple's index files would just grow no matter what. There were a slew of shareware apps, at various prices, that solved the problem of growing index files, and they all did the same thing: they opened the indexes and ran "VACUUM;"
I seem to remember the excuse being that there was a bug that prevented SQLite from auto-vacuuming, but Apple never came up with a good reason why they couldn't vacuum on exit or when the program was idle.
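The behavior described above is easy to reproduce with Python's built-in sqlite3 module: deleting rows only marks pages as free inside the database file, and the file doesn't shrink until VACUUM rebuilds it. A sketch against a throwaway database (sizes are arbitrary):

```python
import os
import sqlite3
import tempfile

# Throwaway database file to demonstrate the effect.
path = os.path.join(tempfile.mkdtemp(), "index.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE msg (id INTEGER PRIMARY KEY, body BLOB)")
con.executemany("INSERT INTO msg (body) VALUES (?)",
                [(b"x" * 1024,) for _ in range(5000)])
con.commit()

# Deleting every row leaves free pages inside the file; it does not shrink.
con.execute("DELETE FROM msg")
con.commit()
size_before = os.path.getsize(path)

# VACUUM rebuilds the database and returns the free space to the OS.
con.execute("VACUUM")
con.close()
size_after = os.path.getsize(path)
print(size_before, size_after)  # the vacuumed file is far smaller
```

SQLite also has `PRAGMA auto_vacuum`, but it must be set before the first table is created, which may be why an after-the-fact fix needed an explicit VACUUM.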
Scott Hanselman's post "Everything's broken and nobody's upset" appears twice in my feed reader.
But, seriously, nice post and I agree. We (developers) have become lazy and our users have become lazy. One problem is that there is really no effective (cost and time) way to report bugs and actually get a fix. Ever contacted anyone from Google support?
Today's motto: "Meh, it'll do."
However, when you choose to be reminded when arriving at or leaving a location, it gives you, not the maps app, but the contacts app. It only lets you set a reminder for arriving at or leaving a pre-saved address attached to a contact. It's... I just can't... How stupid is this? I mean, come on! Let me set a reminder for when I drive by the supermarket to remind me to get eggs! There's a GPS, there's a reminders app, there's a maps app! Oh well...
The building industry is a joke that makes Scott's problems with software seem pretty trivial. He hasn't died even once as a result. Watch some "Holmes on Homes" and tell me that that industry doesn't have the same problems. There's the same conflict between features, price, quality and speed in both industries.
There's a term for this type of problem but I can't remember it. Grr.
To prove the point of the author, Firefox crashed while I was reading it!
And to your question: everything is of such low quality because it's a race. Individuals, companies and corporations are not thinking about making things better, they only wish to get ahead of their competitors.
We have reached an enormous speed of software generation (not creation, not development, but generation). You ship it before you test it and it is deprecated before you could go through the first bug reports. This is insanity.
The same actually happens in other industries as well. New hardware is shipped with bugs and becomes outdated before I could read reviews about it.
Basically every piece of software you use is a prototype, a pilot project, it's never going to get finished, because there is no time to make it good. The same with computer hardware. I even wrote about it:
The new culture of entrepreneurship which flourished in the last years has polluted the web with thousands of useless apps. Sometimes I wonder who those people are who register with every new service and find time to use them all. How can you be simultaneously on Facebook, Google+, Twitter, Foursquare, Pinterest and dozens of others? Don't you need to sleep or rest? I personally watch the current of "apps" going past my eyes, and since it never stops or even slows down long enough for a closer look at anything, I sort of let them all go; I can't process information at that speed.
With software it's sometimes the size of the thing to be tested, but as often it's just hard to see the problem at all. "uses extra disk space"... so what?
Also, the thing with small software teams is that generally that means the source code is small enough to be understood by one person. So naturally it's easier to spot and fix systematic errors. But I don't think those teams are any more likely than a big team to address larger problems like "sometimes the JRE stops releasing memory, then after a while the device crashes". I don't expect the programmer from some random java app to fix that issue, OSS or not. Even though it only starts when I'm running their app.
Personally what I face quite often is a sprawling codebase that's not understood by anyone who still works in the company. It's big and fragile, and different people have learned about different parts, so any change is fraught. It's like the architecture tour I did recently where the engineering manager said "see that giant nest of pipework? We're ripping it out and starting again because no-one knows which pipe does what any more. The problem became critical when the asbestos insulation on some pipes started falling apart".
Firstly computers are no longer algorithmic. Your computer (device/system) is always in some state. Software interacts with this state and with other software. It's a mess.
Secondly, from my experience, people seem to care more about looks and cool features than about quality and raw, everyday work features. At least when they buy stuff. It kicks them later... This is probably because there is a lot to choose from and we don't care enough to spend much time searching and reading.
At the end of the day, if there are cracks in the walls and the building is wobbly we have to look at the foundations. To a certain extent the fact that so many test artifacts are produced and so much goes into software QA and buggy software is STILL released by enterprises -- bugs that gnaw away at usability until the product effectively rots -- suggests that the way we write software and what we write it on has become a spaghettified mess.
I find the following article cathartic reading now and then, for putting a finger on what has gone wrong with the software world since the 1980s. At least then, in theory, one developer could know a machine to the metal.
While it's true that languages, libraries, and hardware can inhibit development, I suspect that the problem is ultimately one of will and imagination. The great thing about the early days of personal computing is that the future wasn't certain and people were willing to experiment with hardware, with languages, and with architectures to get a piece of it. I don't think the will to go back and reinvent is there anymore to the same degree.
Modules, libraries, and object-oriented programming have enabled us to pile structures on structures to build things -- but the resulting structures are, like Alan Kay has said, more like Egyptian pyramids than Gothic cathedrals.
I don't offer any solutions -- just sharing your frustration!
But you'd better hope that the software flying the airplane you're about to board was developed like that.
Systems are like closets, except they can only handle so much complexity instead of stuff. The complexity in a system rises to its carrying capacity. New features must be balanced with a reduction in complexity or the removal of older features. When does a system ever drop a feature?
This has probably been stated by someone else at some other time, but what the heck, let's call this principle Childs' Law on the slim chance someone hasn't said it better. Childs' Law: 'Complexity rises until the system fails.'
Simplicity isn't a feature. Reliability is a feature. There is a strong correlation between the two. However, it is difficult if not impossible to add simplicity to a system. One can add features, but one can't add simplicity. It's usually easier to just start over. Of course, then it is hard to call it the same system.
I think the solution is to develop software in Node.js. No more complicated networking and sync problems as it is blazing fast. Also no storage problems as all is in RAM.
Using such open standards, we shall lead the world of software from the mire of conflicting implementations.
All languages suck (although PHP and Perl are complete cluster-fracks in bad design patterns) and anyone who is a zealot for any one language is someone you should worry about. Currently I love working in Python, Scala and Groovy.
Like the skier wearing dry clothes at the end of the day.
Until business model innovation and go to market innovation catch up, do we need to bear some cold falls as users in order to enjoy some of the great runs down the hill?
The ecosystem that we've built up and learned to live with is at the heart of a lot of these things, and I might go out on a limb to say it's the entire cause. It's not one piece of software acting up. Or Microsoft and Apple together on one machine. It's the combined junk from everything all running together co-habitating but not in a nice way.
Take one piece of software. It goes through a lifecycle. Code is written, scenarios are dreamt up, tests are performed. Lather, rinse, repeat. A software release of fair quality is put out there and it becomes integrated into the collective: your computer. Your computer that runs a bevy of other software, drivers, and utilities, in a combination probably nobody else runs. Your computer that, combined with its hardware and software, produces one of a million combinations, where only 10 or 100 combinations were tested in the labs before the software got out into the wild.
Think about the ecosystem of society. Legal, financial, social, healthcare, municipal, federal, real estate, transportation, technology. It's complex. Nobody can draw a picture of the world because it's gotten to be so large and grown in so many dimensions that even when you look at it, it changes.
A computer system running a single OS already starts off as a complex society. Dozens of drivers and services running, interfaces to hardware, memory, SSDs, hard drives, CD-ROMs, Blu-Ray, USB, monitor, etc. Layer on top of that more services and memory resident programs and drivers and applications. Layer onto the hardware compatibility services, messaging buses that communicate to other services and make use of the Internet and all the communications protocols that entails. Layer onto that software that continues to run in the background and all the complexities of task switching, idle processing, sleep and wait states, hibernation and restoration.
That's a lot going on.
Now toss a single wrench in the middle and watch so many of the pieces break, either directly or indirectly.
Now multiply that effect 10 times across your entire system.
You now have a day in the life of your computer. It's no wonder things break as they do. I'm sometimes amazed this stuff works at all (and many times it doesn't).
IMHO it's not one thing or even a combination of things that causes this. It's the entire collection of *stuff* and how it all interacts. Many times in ways we have no idea how, let alone debug and fix some of this stuff.
At work, I use Java EE (ugh!). I find myself creating stuff that can't work robustly, because of constraints put on by management or poor architecture choices.
On the side, I do embedded work in C with marginally acceptable tools from Microchip. My embedded stuff does work, because it really, really, really has to - but the embedded software in my new 60" TV, every DVR I've ever seen, and my blu-ray player was all done by baboons, as far as I can tell.
I do Android app work (Radar Alive Pro is my first), again with marginal tools (the Mac in the house is partly here in hopes that the Android device driver issues I hit on Windows won't happen on it).
I think that fundamentally, the world is just moving really fast. People want new and more and quality of software is down the list a ways (or, by now, most folks are beaten down by the lack of it and unconsciously just assume everything will be hosed).
One sad thought is that technology management usually imagines that they understand this stuff, and maybe, somewhere, they do.
What's scary is that really important stuff is probably in just as bad shape - infrastructure systems, for example, or banking. How about nuclear early warning systems - you do know that the US and Russia are still at launch-on-warning status, right? And even the thought of software, developed under high pressure, doing high speed trading should make everyone take all their money and hide it in their mattress (currency or gold, your choice).
To go to Start, you just throw the mouse into the corner and click. No hovering. It's the exact same motion you did in Windows 7 or earlier (unless you made things more difficult than necessary and actually made the effort to target the old button). There is no easier thing you can do with a mouse than click a corner of the screen, and it works *anywhere*. You really can't beat the usability of that.
I was recently part of a bid team and had to endure the disappearing-Tasks problem over and over again. I chased all the MS fixes - it still happened. And guess what: it's still there in the 2013 version.
Bugs so bad we started a league table of most-hated software. Outlook and Project tied at the bottom, just below iTunes and Zune!
First, look at Connect and see how many bugs in the tools we use to create software aren't fixed (or are deferred to the next version) - so we create software using buggy software.
Next, look at Visual Studio 2012. Microsoft put a lot of resources into the looks of Visual Studio - not into the quality. So marketing came before quality.
And last, look which tools are available in which version. The testing tools are very expensive!
If you want good software, those tools should be available even in the Express version!
So you are complaining about software quality...
But software quality begins with the tools we use to create software. And even there, on the very field you yourself play on, marketing ranks above quality.
Building quality into software is hard and requires passionate people who know the domain in which they are working. It requires diligence and consideration of 'so what if' scenarios, and we just don't have the language constructs to simply specify good code. It's too easy to write poor code, code that works for one given test case but nothing else. And how many of us have spent days/weeks/months reinventing the wheel for some arbitrary reason (usually because the existing wheel isn't quite the right colour)?
I have had the privilege of working in places where developers genuinely care about bugs and fixing their code, and also watched as this is eroded by the addition of developers who just hack code until it passes a unit test, and managerial practices that don't care about quality at all. If it's not a shiny feature being shipped, it doesn't get onto the roadmap. Fighting tooth & nail to get time allocated to write proper unit tests or fix an architectural problem wears you down when the manager is always saying 'but what business benefit does this give us?'. It reduces motivation when it's more vital to have a pretty interface reinvented every year (*cough*Visual Studio*/cough*) than actual performance bugs addressed.
I understand where they're coming from - today if you're not first to market with the latest & greatest, you lose out, and that has real job consequences. It's just not conducive to good quality software. I guess it's a question of whether we still seek to be craftsmen/women, or just paid to do a job.
It can change. It has to start with the users though. I remember an anecdote (which may or may not be true) about how 3DS Max had become so buggy that the users revolted and told Autodesk 'please, just fix the bugs'. They had two years of releases where they just fixed crashes & improved performance. How many of us would wish for users like that?
On a serious note, Scott, you are absolutely correct in noticing and voicing concern about it. Somehow people have just gotten used to expecting apps/websites to break, just like they expect London buses to be late. Another reason is the tendency toward alpha, pre-beta, or beta launches of products when they are just not ready to be shown.
The solution is the same old-time quality testing. There are no silver bullets. We need to give due attention to quality assurance and control processes in order to give better apps and websites to the world.
I just said to my wife the other day, "you say I'm a perfectionist. I say you're used to garbage work from most people. Very few people turn out quality any more."
The argument about scale is bullshit. That's like telling a professional engineer (not some half baked software "engineer") that every second bridge he designs can have a few flaws because he needs to scale his work.
So any complex software system will, almost by definition, be imperfect to some degree. The question of how large that degree is belongs to economics, not computer science.
Microsoft (and every other software company) as you know has to balance the cost of fixing bugs against the benefit gained. Microsoft's Connect site sees this often:
We have to prioritize the work we take on based on customer need, return on investment, and the limited resources we have at hand. While we understand the logic and sentiment behind the suggestion, it isn't a feature in our mind that provides sufficient value for the effort and resources it would take to implement.
When are you going to give up on the filth that is Microsoft and move over to the light? :)
That's true. Because Scott primarily uses software from Microsoft and Apple. If he was using OSS, he'd encounter the same number of issues, if not more. I know I have, each time I've used Slackware, Ubuntu, or any other *n*x.
That is to say, when they work, and I can find drivers for all of my hardware devices. It took nearly a decade to get support for my wireless chipset.
Even though I have to restart it every ten minutes because it hangs, it *still* helps me get work done faster than 2010 did.
Take my Toyota van - wonderfully engineered in many ways, but you cannot stand under the trunk and load groceries when it's raining without getting wet. Take my wife's favorite gum - the packaging requires you to pop the pieces through aluminum which is left sharp and jagged and easily cuts you. Take my flat panel TV - it's great except that the form factor is too small for quality speakers, so there's no opportunity for space saving from a CRT if you want quality sound. These are all engineering problems and UX problems. Software is not unique in this.
Humans generally can't build things perfectly, and that's okay. No really - it's okay. Why? Because we're adaptable and can learn from our mistakes. What we need to do is adopt a mentality of continuous improvement - from a design perspective AND from a user perspective. And we also need to lighten up a bit and recognize perfection is not attainable. And the pursuit of it, while worthwhile, is never cheap. Lighten up.
I'm a developer. I use Linux (Ubuntu), Android and Firefox every day. I'm happy. Zero problems here. I use lightning-fast software in an awesome world.
I have NEVER had a clean upgrade.
This, from a product that is supposed to manage upgrades.
Each new release produces a multitude of failures and requires I manually uninstall first.
It is the level of quality in their products. Have you ever seen a Tricorder reboot, or one of the many great UIs refresh too slowly?
I believe we still have a way to go until function and quality are on the same level.
Software is doing incredible things for us. Applications are pushed out to the market all the time, for all popular computing platforms, at a dazzling frequency. This is an age of exploration, pushing the boundaries, redefining information technology. Naturally, there are many quirks and 'oopsies', but the overall experience is far beyond what most of us (computer professionals) anticipated just 15 years ago.
I don't think that it's possible to have this rate of innovation and experimentation, without the kinds of problems you've described. I write "kinds" in plural because the sw industry has many issues - old codebases bogged down by backward compatibility (or simply cases of "Boris no longer works here"), groundbreaking code written by people whose technical skills are not on par with their creativity, the urgency to be "out there" before the idea pops up elsewhere and many other scenarios.
So QA is not sufficient, some developers are not committed to fixing bugs, some companies make calculated decisions to ditch an old user base - sh*t happens. Overall, we're living in an IT heaven, data is unbelievably accessible, social tools make the lives of many people much richer than they'd have been just a decade ago, there's an infinite supply of free entertainment, discussions, creativity tools, art, teaching... really, the odd restart or spinning wait icon are way more than a fair tradeoff.
The other partial reason is that 'developers' and 'engineers' need to think like and for the users who are eventually going to use the product rather than thinking in terms of languages, tools, architecture and elegance - they are important, but user friendliness should be the primary focus.
Third reason could be that technology companies are more interested in business than in technology. Instead of focusing on their strong points and niche products, they are trying to do too much.
That's *not* an easy problem.
Add in senior engineers being lured away by well-funded startups or starting their own ventures, and it becomes hard to maintain legacy software and innovate at the same time.
Got to give credit to Microsoft if they can pull off Windows 8.
My Android phone sometimes crashes and restarts. Chrome sometimes crashes and my tabs don't reopen upon restarting it.
Gawd... I've even had Sublime Text 2 crash on me, which most developers love, whereas Notepad++ never did. All of this on an ultrabook with 4 GB RAM, a Core i5, Win 7 and HD3000 graphics.
Software is truly broken, as you said. But in my opinion, the real reason is that software is too complex: the number of interaction scenarios grows combinatorially with each new feature. At a certain point it becomes impossible to manually test the program, and automatic testing isn't perfect.
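As a toy calculation (not tied to any particular product), here is how test scenarios multiply with feature count: pairwise feature interactions alone grow quadratically, and full on/off feature combinations grow exponentially.

```python
from itertools import combinations

def pairwise_scenarios(n_features: int) -> int:
    """Number of distinct feature *pairs* that could interact: n*(n-1)/2."""
    return len(list(combinations(range(n_features), 2)))

def full_scenarios(n_features: int) -> int:
    """Number of on/off feature combinations: 2^n possible states."""
    return 2 ** n_features

# Even modest feature counts produce untestable numbers of states.
for n in (5, 10, 20, 30):
    print(f"{n} features: {pairwise_scenarios(n)} pairs, "
          f"{full_scenarios(n)} on/off combinations")
```

With 30 features there are over a billion on/off combinations, which is why exhaustive manual testing stops being an option long before a product feels "large".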
You state you've had a lot of problems with various software, and you also state:
I think it's all of the above. We need to care and we need the collective will to fix it. What do you think?
P.S. If you think I'm just whining, let me say this: I am complaining not because it sucks, but because I KNOW we can do better.
Okay, so you *know* we can do better? How would you make this better? You don't really say.
However, you've listed around 20 defects from, what, just under 20 products? That's no more than 1-2 defects per product; it's hardly the end of the software developer's world as we know it!
There's also the fact that a lot of the problems (as touched on by previous commentary) are down to the amazing number of combinations of running software/hardware; there are probably hundreds of potential things that could go wrong given those combinations.
I guess what I'm saying is that it's not strictly true that standards and testing have dropped across the board, some scenarios are either unreasonable to test (or too expensive to cover) to warrant such an effort.
I would hazard a guess that most software companies either lack the capacity/money or willingness to handle these edge cases, and to be honest I don't blame them, the software I use works 99% of the time, the 1% that doesn't is usually something I can live without or can forgive them for that hiccup.
Writing software today is more difficult than it was 10 or 20 years ago; there are so many platforms, devices and potential pitfalls. In the grand scheme of things I would say a lot of the time things work pretty damn well given the circumstances.
There just needs to be a culture focus on developing the *right* way and sticking to it all the way through to a real product.
I asked him what he thought of the term 'fraud', and he didn't understand what I meant. I tried to explain to him that if you think a person with no training at all, whom you let 'learn on the job' for a few months, is suddenly able to write any form of software, you're delusional.
Frans, you are my hero of the day. I couldn't have said it better myself. Another related issue I see all the time is developers who may even be really good with Java or Win32 being suddenly dumped into .NET and just keeping on doing it the way they've always done it. I'm certain it goes the other way as well.
I can always tell when .NET code has been written by a Win32 dev because everything returns bool and has out/ref parameters.
Oh, and if I see one more empty...
// do stuff...
there will be...Trouble.
Raising kids on Macs and PCs is training them to be passive consumers of crummy software, when everyone has the capacity to contribute and be part of a healthy software ecosystem.
It's called "Intellectual Sustainability"
That is the problem: it should not be getting harder, because if it continues we will reach a point of intractability, and the cost of that will be phenomenal.
It is scary when you look at the cost in complexity that adding just one option introduces. Anybody who is allowed to define an API or is responsible for new feature analysis should probably be made to sit through an optimization course before they are let loose :) If nothing else, that would slow things down and buy us some time to really sort out the problem just by the attrition rate alone.
Hmmm, so working for MS sucks too, heh. Are you sure you're not going too far with this post?
Install OpenOffice (any flavour). Open the broken Word document in OO. Add a space, delete that space. Save.
Your Word document is cured.
Using smaller libraries makes iMovie work better too... if you also put videos in iPhoto and then pull them into iMovie. If you have ever had your SOUND disappear in iMovie, that's why: it happens when the iPhoto library you are connecting to gets too large.
Good luck meh!
If one is diligent enough to get it done right first time, one probably lost any market share that was there to someone who got it to the market first with bugs and then started fixing them.
Your second problem occurs when you are using different versions of the Apple Contacts data formats between devices. The most common case comes down to two machines with different versions of the OS sharing a common home directory, but you can also see an interaction with iPhone / iPod / iPad / laptop / desktop for similar reasons. This comes down to the use of SQLite, which is not a real database, and so in general is not arranged to use subschema. Apple has a data-centric model, unlike the Microsoft library-centric model, and so when you need to talk to data, you are talking to an app. In other parlance, it's a lack of separation between model and controller. This is the same reason you can't annotate a song in your iTunes library to say "from 1:35 to 1:55 of this song is a ringtone" without iTunes coming unglued and rebuilding your data indices. Basically there's no graceful upgrade/downgrade for the local copies of the database in the cloud.
Your iMessage problem is a side effect of the duplicate contact information. Resolve that and you've resolved the iMessage problem.
The Outlook problem is specific to the model Outlook uses. Instead of fully validating container objects, which would require downloading the full container, it starts assigning meaning to the data while it's in flight. This has the effect of immediate gratification, since it can start rendering the messages immediately. For content-transfer encoded MIME data, that means two things: (1) Malformed data can crash Outlook, and (2) Security is poor: malformed data can be easily used to attack your system with local code execution exploits. While a lot of work has gone into patching exploits, there are always new plugins and MIME types which result in more problems. External renderers are the solution the iPhone chose; there are other equally valid solutions that would prevent problems, intentional or otherwise, but which are not implemented. Most companies tend to rely on their mail servers to massage the data to avoid the problem, but that's just a spackle-it-over solution.
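That tradeoff, assigning meaning to data while it's in flight versus validating the whole container first, can be sketched with a toy parser. This is purely illustrative; the function names and logic are invented here, not Outlook's actual design.

```python
def render_streaming(lines):
    # Interpret each header line as it arrives: immediate gratification,
    # but malformed input has already been acted on by the time it's caught.
    rendered = []
    for line in lines:
        key, sep, value = line.partition(":")
        rendered.append((key.strip(), value.strip()))  # rendered right away
        if not sep:
            # Discovered only after a partial render has happened
            raise ValueError(f"malformed header after partial render: {line!r}")
    return rendered

def render_validated(lines):
    # Validate the whole container first: slower to first paint,
    # but malformed input never reaches the rendering stage.
    if not all(":" in line for line in lines):
        return None  # reject up front; nothing was rendered
    return [tuple(p.strip() for p in line.split(":", 1)) for line in lines]

good = ["Subject: hi", "From: someone"]
bad = ["Subject: hi", "no colon here"]

assert render_validated(good) == [("Subject", "hi"), ("From", "someone")]
assert render_validated(bad) is None
```

The streaming version paints faster, which is exactly why it's attractive, but any code it runs against half-checked data becomes attack surface; the validate-first version trades latency for a cleaner failure mode.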
The iPhoto Photo Stream issue is, I think, a misunderstanding on your part. No photos will be kept past 30 days unless you save them to your local camera roll. What that means is that if you take photos on an iPhone, and then go back and look at them on some other device, there's going to be more photos on your iPhone after the 30 day limit because they are in the local camera roll, but the other devices which only access them via the Photo Stream will only have the last 30 days. This appears to be a confusing point for a lot of people who expect that their photos will instead be replicated to all of their devices ("Synced") instead of merely streamed (Photo _Stream_, not Photo _Sync_).
The Outlook outbox is all about Outlook's broken offline model. Mail.app does better on this, and worse on other things. Frankly, most mail programs these days are underwhelming unless you are willing to run older programs. If you use Mail.app, for example, you'll definitely want to disable presence notification, especially if you have a lot of mail, since it will be blinking the green dot on and off for all the messages, not just the ones in your viewable scrolling area.
Can't help you with Lync.
The Final Cut Pro issue is running UI in a plugin thread which is not the main thread. You need to seriously consider not running the plugin. Meanwhile you should report this to Apple, since there will be CrashReporter logs. They may not do anything about it (when I was on the kernel team at Apple it took moving Heaven and Earth to get them to stop using a deprecated API), but then again, they may surprise you. These people generally care about good user experience and stable product, when they are made aware of issues.
No idea on your Calendar. Typically this type of thing comes from a corporate LDAP server that publishes iCal birthday records for everyone in the company, but I couldn't tell you for sure.
iPhoto being unusable with a lot of photos probably ties back to your address book issue: I expect that you are using an older version of the OS on one of your machines. My understanding was this was fixed in a more recent version by doing cell-based rendering so as to not try to render everything, even if it's not user visible in a scrolling region.
The Word documents are probably a version mismatch; the easiest thing to do, if they are not confidential, is to use http://www.zamzar.com/conversionTypes.php to convert them to an older version of .doc. This may not work for you; it really depends on the root cause of the failure. The typical response when you have this issue, if it's in fact the machine you used to write them in the first place, is to do a "Repair Install" of Word, which will fix any plugins and any doc type associations.
Web site browser lockups are a common side effect of adblocking software and of URL rewriting software; make sure you are not using a proxy you did not expect to be using in your browser. The adblock tends to fail if they use an onLoad clause to verify their ad has loaded, and you've blocked it. This is typically a coding error in the adblock software.
And moreover, the whole bugfixing chain is broken:
1) Users have no easy way to report bugs.
2) When they get such a way, they do not use it.
3) When they do report some bugs, these bug reports get lost in some big pool. It is not uncommon to see reports several years old, in both open source and proprietary applications.
I don't get people's obsession with turning on every setting and customising every detail. People always seem to want more than what something is capable of; just be happy with what it does well! (The people I refer to are family and friends, btw.)
Also, common sense.. so what if Final Cut Pro crashes while saving if you scroll too fast - scroll slower, then!
Last time I discussed this publicly, I was laughed at, insulted and dutifully trolled out of the discussion...
Now I thought Visual Basic was great, and it was easy compared to programming in DOS. I moved on to pure Windows API programming while the world went crazy about .NET. I didn't work with another Microsoft language for a decade (I use a non-Microsoft language). Recently I tried to play with the latest Visual Studio so I could try building Metro apps. It was an absolute shock to the system. Definitely not easier! The minimal OOP in VB for GUI stuff was great. Now everything is an object and coding is a nightmare. I can't see how anyone can write maintainable code anymore. I think the development tools are broken.
How would you feel if you had to take your car to the dealer every month to fix its issues? Unlike with software, whenever there is an engineering problem in a car design it becomes a major news headline. You stop buying that car, or even that brand.
NB: I am a software developer myself, and I believe that software can be just as perfect as other facets of engineering if we get enough time to finish it to our satisfaction.
Interestingly, the bare-metal software is way better in this respect (e.g., file systems or memory/process management are hardly ever flawed). But since all the user actually sees is the broken chrome sitting on top of it, the overall experience is spoiled nevertheless.
I have not been able to successfully run the DevFabric emulator from the Azure SDK 1.7 in order to debug my apps. Used to work ok in 1.6, then started breaking, so we upgraded to 1.7, still no good, so I upgraded to Win 8 clean install...still no dice.
It keeps thinking my web roles are not running in an initialized role environment, so calls to get cloud settings don't work - e.g. can't get a storage account to retrieve blobs.
1) Deliver software (inevitably with a few bugs)
2a) User reports bug
2b) User asks for new feature
3) User starts asking when new feature will be present.
4) New feature gets implemented.
5) goto 1
Most of the code in a modern application is about manipulating those fundamental abstractions into something the program can use: tons of code for parsing raw text, XML, JSON, etc. into objects; tons of code for making programs talk to each other; tons of code for persisting data into organized data stores; tons of code for presenting the data to the user... and within all that, a little bit of code that is actually the application's features.
Steve Jobs said, when asked what was his goal with NextStep, something along these lines: 80% of what each application does is the same from application to application, and he wanted to move that to the O/S, so as that writing applications would become much easier and shorter.
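A minimal Python sketch of that ratio, with invented names purely for illustration: nearly everything is plumbing, and the actual feature is a single line.

```python
import json
from dataclasses import dataclass

# Plumbing: a typed shape for the data the program will work with.
@dataclass
class User:
    name: str
    age: int

def parse_user(raw: str) -> User:
    # Plumbing: parse raw text and shape/validate it into an object.
    data = json.loads(raw)
    return User(name=str(data["name"]), age=int(data["age"]))

def next_birthday_age(user: User) -> int:
    # The actual feature: one line amid all the plumbing.
    return user.age + 1

user = parse_user('{"name": "Ada", "age": 36}')
print(next_birthday_age(user))  # prints 37
```

Scale this up with persistence, networking, and presentation layers and the feature-to-plumbing ratio only gets worse, which is exactly the 80% Jobs wanted the OS to absorb.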
Ditch those and life will be a whole lot simpler.
It's a shame having things you don't "own"...
My own experience is the same: Windows becomes more and more broken over time. Therefore I restore my system completely every 3-4 months from an image made from a freshly installed system, sometimes earlier if something seems messed up.
Which I believe is why I generally have very few problems with anything, even though my system is over 60 GB with hundreds of programs installed.
I'm looking for a pattern in all these issues. It seems like cumulative data entry has failed. Duplicate contacts, missing space, and failing connections are all settings and data that became problematic from some other "helpful" data entry system. Maybe you upgraded the program, or ran an import wizard, or whatever, but over time the data became valid to a schema but not logical to its intent. I have the Birthday calendar problem too. The issue isn't that it has automatically imported all the Facebook birthdays; the issue is that I can't remove them easily.
We need more Admin modes or Admin-type cleanup tools. This is basically why we've been mucking around in the registry for the last fifteen years.
And as far as doing something about these issues... We have blogs where we can point them out and hope that others have already found the answers. Now we just need to make sure the vendors read our blogs and all answer their emails.
I believe that this issue is timeless: If I own a cow, the cow owns me. You'll find analogies to this very issue since people started keeping a history.
I know we can do better, too. But we have so many cows today... and where shall we spend our time?
Crazy, crazy, these things.
* Phone: PalmPre3
* Workstation: Dell Laptop w/ LinuxMint12
* Organisation: Paper+Google-Addiction
Palm Pre 3 - Haven't installed anything since I bought it. It just works. I had a Samsung Galaxy before and tuned it for weeks - heck, I even compiled the ARM kernel once, in the silent hope of squeezing some more performance out of that laggy system.
Laptop w/ LinuxMint12:
(a Dell Vostro, 2 years old.) I connected Thunderbird and Evolution as front-ends to my Google addiction. It works and is fast. However, these days I prefer to run a single Firefox window as a frontend for gcal. I have to admit that I love Linux and have tuned many parts (tmux+vim+zsh+many more) to my needs, and pushed every single config file to GitHub so I meet a similar work environment at my actual work.
For browsing I use Opera. Breaks at some sites using the "Turbo-mode", while tethering. Tolerable considering the quality and stack of stuff.
At work we have iMacs, which I degraded basically to a terminal to log into our Linux-Cluster and work there (terminal/rarely KDE), where the environment is almost identical to my home env, thanks to git-dotfile-repo. Nevertheless, there is one thing I really envy: iTerm2. I wish someone created such a terminal for the *nix community...
Scott, fix it yourself! You got the brains for it tenfold and this blog-post is a great catalyst. No point in blaming others or the system. There is a lot of trash outside, but there are also diamonds hidden, which sometimes need to be ground by hand though.
If there is interest, I'll create a blog post on my work environment another day.
Would you expect a bricklayer to do the plumbing on a house?
A lot of the comments about management describe things I see regularly, and it is frightening sometimes.
I also think outsourcing has contributed to this problem a lot too.
It's a simple idea really: good teams with good people and reasonable processes tend to write decent software. Bad teams with poor-quality people and poor development processes will write rubbish software. Most companies can't measure the difference between these two scenarios, so they tend to go for the second because it's cheaper on paper and easier to sell to their equally out-of-their-depth manager.
In my humble opinion, the cause is corruption. When profits and bribes dominate the system we get the crap we are getting now, and not just in electronics. Planned obsolescence: it has crap built in. We all know that piece of shit is likely to die two days after the warranty runs out.
I think they could figure out where the "twiddle bits" go to make a stable system if not for all the status quo bullshit.
I'm not saying that it works 100% correctly all the time. But at least, when it breaks, there will be lots of people who do care. And if you are willing and have some programming skills, you can even help fix it.
”I like using open source because I like having the assurance that I can always fix it if I just learn enough, and that there’s nothing blocking my path toward learning it other than time constraints”.
For a rough example; Windows 8 is 'mostly' what Microsoft planned as the next OS after XP but what we got was years of getting incrementally closer: XP sp1, sp2, sp3, Windows Vista, Vista sp1, Windows 7, SP1. While all the releases are being planned by the project managers all the quality code gets shoved aside, luckily some slips through.
It's true that if you held the release for code the programmer was completely happy with, nothing would ever be released; but all this Agile, has-to-be-done-right-now, work-18-hour-days culture just ends up with code where none of the corner cases ever get tested.
I suggest for reading:
- Responsibilities of Licensing
- Software Quality & Programmer Quality
I noticed that my iPhone was filled with old data that I simply could not see or delete. The only way I got around it was to install PhoneDisk by Macroplant.
Now I can see all the files on my iPhone and delete and move files without having to use iTunes.
Reinstalling Windows masks the other crap software you have installed. If reinstalling fixes it, something is breaking it.
Do you replace the engine on your car every time you need an oil change?
The best thing is I can create a new contact from the phone and it immediately shows up in Google Contacts... and vice versa.
I also have my calendaring set up to sync through Google too, and everything stays in sync in iCal on my Mac, my iPhone, and through Google Calendar. I believe much of this can be achieved through iCloud now, but meanwhile I'm sticking with Google... it works very reliably and has for a long time now.
Address the common part of the problem -- "You and everyone else keeps buying and using ... no matter how buggy it is". Anything else is just pointless whining.
Man, please. You are working for a company doing crap from the beginning (and especially in the mail domain), you trust your life to a big shady company, and want some interoperability with Apple.
I'm sorry, but it's obvious you are getting into trouble, your choices are completely braindead.
Something that often happens in a poor environment is that some software is built badly. Then the rest of the features end up getting stacked on top of a poor "base". The rest then becomes too expensive to fix, so the bean counters demand it be shipped "as is".
Something that tends to contribute to this badly is the fact that it is almost impossible to gather decent requirements for software which are not going to change significantly after it's been built.
I don't see too many bridges designed and built for 4 lanes of traffic having a railway stuck onto the side after they have been completed. But software engineers have to put up with this because, quite often, the people making the decisions cannot understand what they cannot see.
"The Hair Thieves" and "the hair thieves" being treated as two different bands is probably my biggest bug gripe at the moment.
"I like this one. The other one was full of disappointment, and made me cry."
That perfectly describes how I feel about a lot of software these days.
Foremost, though has to be these:
1. software development is REALLY HARD. There are a lot of people doing it who aren't really up to it (though they are JUST good enough to produce working code in the minimal sense).
2. our computation model is binary, brittle, and unable to spontaneously adapt to new conditions. This won't change until a revolution occurs. As a result, even well-written and -tested code breaks when it encounters something unexpected, which is guaranteed to happen eventually.
and 3. the pressure to get to market quickly, and hopefully avoid irrelevance, is just tremendous right now, during this period of rapid, disruptive change. Others have covered this topic very well already.
I believe you misunderestimate the amount of cash most of this "Free" stuff brings in, through ads and monitoring.
Alternatives? There are bugs in all of the alternatives I can think of. It depends on how the bugs affect you on a daily basis, but there's no escaping terrible software.
S'like going to a restaurant and having them screw up your order by putting pickles on when you ask for them to be removed, then switching to the restaurant across the street - where they do the exact same thing, only this time with olives.
@Frans Bouma: Great comment!
My application development experience has mostly been for various departments within the State of California. In that experience I've worked on many teams and many applications and nearly all of them have sucked painfully.
At the pace the state workforce moves, we can't even blame time to market like the fast moving private sector might. There are even occasions when a project is stretched across a fiscal year for budget protection, because with the State money unspent is clipped from next year’s allowance.
We simply ignore quality, even security, because most of state staff are the pseudo-devs you speak of. In fact, because I've worked here so long, when I compare myself to the skill levels of my peers and mentors, I often feel like I am one myself.
The State cannot retain truly legit developers because their projects are often lame and driven by back-pocket friendships or legislature, rather than by business use or duty to the public as we've been charged.
It's sad really; and I'm sad for the same reason... because I know we can... and more because our duty is to serve California, because we should.
Replace "legislature" with "Board of Directors," and everything you just said applies to private companies equally.
When you compare the number of people who can with the number in the field you quickly become amazed that anything ever works rather than wondering why nothing does.
We are overrun by complexity. Products are shipped with many, many bugs. There is no way to fix them all and ship, especially when talking about interaction with other products from other companies. Some bugs are low priority. For others, the mantra is "there's a workaround," although it's not clear how anyone is supposed to figure out the workaround.
At least these days, we can do a search to find others that have had the same problem. Only problem is that we cannot tell which solution will work or which solutions have really worked for individuals. And it's unlikely that a company will admit to a bug that they have no desire to fix.
We need to keep improving the state and tools of software development. And in the meantime, we need to have better testing and better feedback loops between developers and their users.
Re: Alternatives. Not buying is only one alternative. There are others.
You could write your own, start a company and hire people to write your own, fix bugs yourself or pay someone else to fix bugs in an open source equivalent, get a bunch of users to petition a vendor to fix a bug, and so on. And those are just the ones there are already mechanisms for.
We could invent new mechanisms for other ways that don't exist yet. Want to encourage someone to take on the task of developing an app you need? Eliminate their ROI risk by offering an "X" prize with funds collected from presales. Want to make commercial SW vendors fix their bugs? Get laws passed that force them to accept returns of their SW, one justifiable reason being that it was too buggy to be usable.
Like I said, address the problem -- that people accept buying broken SW. As long as people are willing to buy broken SW, others will continue to sell it to them.
So, when you use the word "literally" to describe something for which you are not literally being literal, you're still literally correct.
Such is the nature of the evolution of language.
Well, I could go on indefinitely, but you get the point - greedy monkeys are the root of evil.
That's not going to change until customers start demanding quality. I don't expect to see that any time soon.
Why can't I see folder sizes in Windows Explorer in the same place file sizes are displayed? Instead, I need to right-click and select Properties.
People have been complaining about this on http://www.annoyances.org/ for many years... but no one is listening. They are just adding animation to the GUI.
2. Microsoft: you don't need the index service - disable anything that you do not need.
3. Apple: See 1. Apple software doesn't work. They only make beautiful devices.
4. Apple: Again see 1.
5. Microsoft: Don't use expensive, bloated software - try Thunderbird (for example) for a change.
6. Apple: Again, see 1.
7. Microsoft: See 5.
8. Google: Why use gmail if you have a normal email client? - Don't use what you do not need.
9. Microsoft: File a bug report - you work for them remember?
10. Apple: Again apple product. See 1.
11. Microsoft: Buy an agenda.
12. Apple: Again, see 1.
13. Apple: Again, see 1.
14. Microsoft / Apple: Doesn't it look good or doesn't it work? - and again the biggest problems on the Apple, so see 1.
15. Google: So don't use it - browsers enough in the world.
16: Apple: See 1.
17: Microsoft: See 9.
18: Microsoft / Google: See 15.
19: Google: See 8.
So, to sum it up:
Apple: 8.5 out of 19
Microsoft: 7.5 out of 19
Google: 3 out of 19
Throw away your apple devices and your life is twice as good. Quit Microsoft and it becomes almost perfect.
And as soon as you get your personal/social life at home instead of in Google, your life will be perfect.
I in no way want to slam you, Scott, but let's take this "oops": "I'm am complaining not because it sucks, but because I KNOW we can do better." As soon as it becomes troublesome for that "I'm am" to be corrected, it won't get corrected. The business feels the meaning is still there, the functionality is there, and the business decides that the risk of correcting it is too high... so it doesn't get corrected. I see the same thing happen on a daily or weekly basis.
This is why I enjoy working with small development teams, often with just one developer. Sure, he will make mistakes -- we all do -- but we need the lowest possible hurdle to correct those errors. Facebook seems to have gone too far the other way; a victim of its own success.
Would you expect to buy a new washing machine/ fridge that had a suitability for purpose disclaimer? Breaks down on third day - "Sorry, mate. No refund or repair. You've been using it for its intended purpose."
I suspect it's part of the old "engineering" attitude: better to spend 10X as much fixing the problem than 2X as much getting it right to start with.
I've been writing software since the late 1970's so would probably have been sued to extinction if my first suggestion had been in force!
As someone commented above: stay very simple, avoid bloated crap, avoid the cloud.
I'm upset! Have been for years!
But you can change your thinking and realize what you've described is an opportunity. To create new, well-crafted systems that really work, all the time, without the pain. It's not easy, and takes a mix of logic, art and intuition. But the result is something that truly differentiates itself in a way that marketing veneer just can't match.
In the grand scheme of things, software development is still a relatively nascent field compared to other areas of engineering. Just look at the wealth of formal validation techniques used in Electrical Engineering. Our tools are just getting there.
So while I can't give you back your 3 gigs of iPhone space, hopefully I can provide 770 bytes of inspiration.
I think another reason why so much software is erroneous is that changes of any kind always introduce usability issues: usability is all about meeting user expectations. Every change you make requires the long-time expert users of your software to adjust to the new application behavior, so even if you greatly enhance the usability from a new user's point of view, old users might get confused and feel distracted by the change. Hence it's much easier to introduce new features than to correct design flaws made years ago.
My solution? Continue to sell what you have while re-implementing it in the background from scratch, as Apple did with OS X. Just don't be afraid to break with old conventions and principles - people who don't like the new approach can stay with the old one!
And remember what Einstein said: "We cannot solve our problems with the same thinking we used when we created them."
Recently, I read Jared Diamond's book 'Collapse' and find it notable that when a society is losing it, it cannot take enough risk to act on its problems and seems to resort to superstitious behavior. Companies do that too. Bug scrubbing becomes like a religion or like trench warfare, and we can't break out of the loop. I also think that products have life cycles, that entropy rules, and that every code base has a point where the company should just stop, because dinking with it is only going to make it worse.
It's a complexity and amplitude problem.
Where a twiddled bit is the difference between a robust app and a system crash, where test coverage has to enumerate every possible scenario, and where digital means no graceful degradation, we get our current software ecosystem.