Scott Hanselman

Bad UX and User Self-Blame - "I'm sorry, I'm not a computer person."

April 24, '15 Comments [39] Posted in Musings

In my recent podcast with UX expert and psychologist Dr. Danielle Smith, the topic of "user self-blame" came up. This is the feeling a person gets when they're interacting with a computer, something goes wrong, and they blame themselves. I'd encourage you to listen to the show; she was a great guest and brought up a lot of these points.

Self-blame when using technology has gotten so bad that when ANYTHING goes wrong, regular folks just assume it was their fault.

My dad got some kind of evil "PC Tech Hotline" on his machine today because some web site told him his "Google was out of date and that he should update his Google." So he did. And he feels super bad. Now, in this case, it was a malicious thing so it would be really hard to figure out how to solve this for all users. It's like getting mugged on the way to your car. It happens to the best folks in the best situations, it can't always be controlled. But it shouldn't cause the person to blame themselves! He shouldn't fear his own computer and doubt his skills.

People now publicly and happily self-identify as computer people and non-computer people. I'll meet someone at a dinner, we'll be chatting, something technical will come up, and they'll happily offer, "Oh, I'm not a computer person." What a sad way to separate themselves from the magic of technology. It's a defeatist statement.

Get a Tablet

Older people and people who are new to technology often blame themselves for mistakes. Often they'll write down directions step by step and won't deviate from them. My wife did that recently with a relatively simple (for a techie) task. She wanted to record a lecture with a portable device, load the WAV onto the PC, even out the speech patterns, save it as a smaller file (MP3), then put it in Dropbox. She ended up writing two pages of notes while we went over it, then gave up after 30+ min, blaming herself. I do this task now.
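For a techie, that whole two-page task boils down to one ffmpeg invocation plus a file copy, which is exactly the gap in abstraction this post is about. Here's a minimal sketch in Python, assuming ffmpeg is installed and using hypothetical file names and a hypothetical Dropbox path, that builds the conversion command (loudness-normalize the speech, compress to a small MP3) and copies the result into a synced folder:

```python
import shutil
from pathlib import Path

def convert_command(wav_path, mp3_path, bitrate="64k"):
    """Build an ffmpeg command line: normalize speech loudness,
    then compress to a small MP3 (a low bitrate is fine for voice)."""
    return [
        "ffmpeg", "-i", str(wav_path),
        "-af", "loudnorm",   # EBU R128 loudness normalization ("even out" the speech)
        "-b:a", bitrate,     # target audio bitrate for a small file
        str(mp3_path),
    ]

def publish(mp3_path, dropbox_dir):
    """Copy the finished MP3 into a locally synced Dropbox folder."""
    dest = Path(dropbox_dir).expanduser() / Path(mp3_path).name
    shutil.copy(mp3_path, dest)
    return dest

# To actually run it: subprocess.run(convert_command(...), check=True),
# then publish("lecture.mp3", "~/Dropbox/lectures") once the MP3 exists.
cmd = convert_command("lecture.wav", "lecture.mp3")
```

Knowing that this is "just" a pipeline of three steps is the invisible expertise; the two pages of notes were a non-techie reconstructing that pipeline from scratch.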

Advanced users might say, you should get your non-technical friend a tablet or iPad. But this is a band-aid on cancer. That's like saying, better put the training wheels back on. And a helmet!

Tablets might get a user email and basic browsing and protect them from basic threats, but most also restrict them to one task at a time. And tablets have hidden UX patterns as well that advanced users use, like four-fingered-swipes and the like. I've seen my great aunt accidentally end up in the iPad task switcher and FREAK OUT. It's her fault, right?

Um....

This harks back to the Middle Ages, when the average person couldn't read. Only the monks cloistered away had this magical ability. What have we done as techies to make regular folks feel so isolated and afraid of all these transformative devices? We MAKE them feel bad.

There used to be a skit on Saturday Night Live called "Nick Burns, Your Company's Computer Guy" that perfectly expresses what we've done to users, and to the culture. Folks ask harmless questions, Nick gives precise and exasperated answers, then finally declares "MOVE." He's like, just let me get this done. Ugh. Stupid users. Go watch Nick Burns; it's a 19-second snippet.

I basically did this to my own Dad today after 45 min of debugging over the phone, and I'm sorry for it.

I'm not a techie

When users blame themselves, they don't feel safe within their own computer. They don't feel they can explore it without fear. Going into Settings is a Bad Idea because they might really mess it up. This UX trepidation builds up over the years until the user is at a dinner party and declares publicly that they "aren't a computer person." And once that's been said, it's pretty hard to convince them otherwise.

Googling "Why are users so..." and Google recommends "stupid"

Even Google, the most ubiquitous search engine with the most basic of user interfaces, can cause someone to feel dumb. Google is a huge database and massive intelligence distilled down to the simplest of UIs: a textbox and a button. And really, it's just a textbox these days!

But we've all had that experience where we google for something for an hour, declare defeat, then ask a friend for help. They always find what we want on the first try. Was it our fault that we didn't use the right keywords? That we didn't know not to be so specific?

I think one of the main issues is abstractions. For us techies, the abstractions are there, but they're transparent. For our non-technical friends, the whole technical world is a big black box. They may have a conceptual model in their mind of how something works, but if that model doesn't line up with the technical reality, they'll be googling for a while and will never find what they need.

Sadly, it seems it's the default behavior for a user to just assume it's their fault. We're the monks up on the hill, right? We must know something they don't. Computers are hard.

How do YOU think we can prevent users from blaming themselves when they fail to complete a task with software?


Sponsor: Big thanks to our friends at Raygun for sponsoring the feed this week. I use Raygun myself and I encourage you to explore their stuff, it's really something special. Get full stack error reporting with Raygun! Detect and diagnose problems across your entire application. Raygun supports every major programming language and platform - Try Raygun FREE!

About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.

Hosting By
Dedicated Windows Server Hosting by SherWeb
Friday, 24 April 2015 06:48:36 UTC
I haven't listened to the podcast yet so don't know if it came up, but self-blame is a big topic in Don Norman's book The Design of Everyday Things. He gives plenty of examples of people interacting with bad software and hardware and when things don't go as expected, blaming themselves.

I don't believe he has a solution (I'm partway through the book, having just discovered it again after reading it for the first time 20+ years ago) - he puts the onus squarely on the people designing the interactions.
Friday, 24 April 2015 06:57:35 UTC
I don't think it's fair to say to UX designers or technical support that it is *their* fault that people have no interest in learning how to use technology.

It's simply a sad fact that being technically skilled is still looked upon by a lot of people in the western world as "geeky" or weird. Look how many people are proud to say that they aren't good at maths or go back 25 years and see how many grown adults happily told everyone that they couldn't program a VCR (one of the simplest "technical" tasks around).

Ultimately, there's no point pandering to people who don't want to learn, especially if it hinders the UX for more technical people or prevents technical support doing their job. Would you expect a plumber to walk you through how to do the repair when he visits your home or would you get out of the way and let him apply his skilled knowledge to the problem without interference?
Kat
Friday, 24 April 2015 08:06:37 UTC
I don't think @Kat's plumber reference is applicable here. Everybody needs some level of technical competence; that's just the nature of computers. I don't need to know how my tap works and exactly what to do when it breaks, but I do need to know how to use my PC, how to use the applications that I work with every day, and expect them to be designed such that my intuition points me in the right direction on how to use them.

We should design our UX's to be clear and unambiguous, so that users using it for the first time can do basic tasks without a massive learning curve; then they can progress with confidence rather than being afraid to do even the simplest of tasks. The more complex tasks should be easily and naturally discoverable with prolonged usage. It's not about pandering to people who don't want to learn, but about making it easier and more natural for them to learn.
Friday, 24 April 2015 09:00:27 UTC
In my experience there is one fundamental difference between a technical and non-technical user. A technical user will just google the problem.

I'll give you an example. If a warning popped up on your screen right now saying you have malware or a virus, what would you do? I'd be willing to bet you would either know exactly how to clear it using your AV software, or even more likely you would just google the name of the virus to know what its deal is and then follow the instructions on how to remove it. A few more examples:

- You want to stop a PDF being printed or edited
- You want to create a screencast of your desktop
- You want to convert a video to divx

I have barely any more expertise than a non-tech on these sorts of problems and many others. Half the time when I'm helping all I'm doing is doing the same google search they could have done and then looking through the results.
Friday, 24 April 2015 10:05:32 UTC
I agree with Simon - my life revolves around googling the answer! I find older people who were never exposed to computers until much later in life are very fearful. My Mum is very hesitant about doing things on her laptop - it took ages to help her understand that hitting the wrong key will not cause it to 'blow up'. She just got an iPad Air and loves it - I was actually quite impressed at how quickly she became comfortable using it, and she talks about it in glowing terms! Sometimes I think they just become overwhelmed by how many options there are - which should I choose? Patience is needed, but sometimes the war has already been lost and it is best to just get them the tablet!
Friday, 24 April 2015 10:52:53 UTC
Saying "some people are afraid of technology" makes it easier to not try. It should be the opposite. We should be trying to reduce the barrier for the folks that will find it harder to try our software, but building in the depth and poweruser capabilities for the folks that want to be adventurous or who started fearful but have grown to be comfortable. Comparing someone that has been using technology daily for a decade or more to someone who has been chased away from it the few times they have tried is a bad comparison.

Building applications that are usable only by people with high technical skill, or by those willing to go learn how to use them before doing the most basic things, is the easy button for UX. This is the "you have managed to make it not suck" level. It will cause people with lower technical levels to get frustrated, and it will give them an even lower tolerance for technology on the next thing they try. It causes me to make sarcastic remarks about your product or service on Twitter, despite being highly technical. It limits the number of people who can get benefit from your thing (and put money in your pocket) while, for some, also reducing their self-worth a little more and strengthening their resolve that they aren't computer people.

And this all goes back to Scott's point, because every one of these apps that they come in contact with will increase their frustration level and make them more reluctant to try the next app, or less willing to start learning how to use it, regardless of their starting level of technical interest. There are no natural "non-computer people", there are only people we have trained to expect frustration, annoyance, and occasional illogical magic when they manage to press the right 5 poorly labeled buttons in 6 places and receive the result they wanted.

As an aside, I find it interesting that there's this mystical group of computer people that don't mind looking up hard stuff, but draw the line at the hard topic of making technology more approachable to less experienced folks.
Friday, 24 April 2015 13:13:42 UTC
As always, there is truth to both sides. I, as a developer, also hate some of the UIs. If the UI can make a technically skilled person hate it, then it surely won't help people without tech background get into 'computers'. All the devs should think about what kind of people will use their app, design accordingly, and try to make it as simple as possible. But there will always be a bunch of 'lazy users', who will resign after first popup, even if it just said something like "Feature X is disabled and is needed to complete this operation. Please go to Menu, open Sub menu Y and check the box Z".
Miroslav
Friday, 24 April 2015 13:19:10 UTC
As a developer I ask myself "Would my grandmother be able to easily use this application?". Then I strive to build the UX in the most intuitive, discoverable fashion that makes sense for the task that the application accomplishes.

I've had to learn a lot about making the application for the people using it and not for myself. That doesn't mean that I have dumb down the interface, it means I have to think outside of what I know and learn how the users think and provide them with the experience that makes sense.

I find myself apologizing on behalf of other developers to "non-technical" people and try to help them figure out why they ran into a problem or what a certain feature in an application really means. I never take over a task for someone. I don't learn when someone does something for me and I don't expect someone to learn if I do something for them.

I encourage people to explore the settings in an application, especially before they start doing any real work with the application.
Isaac Bailey
Friday, 24 April 2015 13:27:56 UTC
I don't think it's always a downward spiral into total rejection of technology, descending one bad app or machine at a time.

My own parents, by their own admission "not technical people," have actually become self-reliant. They make their own hardware and software purchases with confidence, they don't wait for my wife and me to "bless" sites they can visit, they regularly video chat with us on Skype, and they're a general PITA on social media all by themselves. I've stopped being called for computer problems or purchasing advice altogether.

In any case, there is also the possibility of people like us, who built on one technical success after another, culminating in full Hanselmanitude. And that can happen to anyone.
John Dunagan
Friday, 24 April 2015 13:59:11 UTC
If we step back a little to look at things, one element of what is happening when people blame themselves for having difficulty with technology is its relative immaturity. It is analogous to the advent of the automobile. In the early days of that technology, an automobile driver had the status of today's jet pilot. Now virtually everyone is an automobile driver; getting into a car to drive is a non-event. Eventually everyone will use computational technology without thinking anything of it. As much as those of us who are already comfortable with it may do our best to ease the path for those who aren't, there is a process involved in integrating a technology into a society and culture that only time and experience can resolve. In the meantime, those of us who cannot will suffer in doing, and those who can will suffer in observing.
Ben Pedersen
Friday, 24 April 2015 14:18:13 UTC
I thought it was a great podcast (as usual), and this is a point that I think about a lot since I manage multiple teams of developers who are constantly interacting with the users and customer support.

I'm of two minds on it, but I think it really boils down to the car analogy that you and your guest made in the podcast.

Think about the average person's relationship to their computer in the same way that you think about the average person's relationship to their car (minus the love, in my experience).

On the one hand, your average person knows very little about how to troubleshoot anything in their car. They have a problem, they take it to the shop. Even regular maintenance tasks that are fairly straightforward (e.g. changing the oil) will lead to a visit to a mechanic. So on non-standard operational tasks (i.e. anything other than driving somewhere and parking), your user can't/doesn't feel comfortable.

You can relate that to your wife's two pages of notes on manipulating audio. If you were to give her instructions on how to change out the air filter in the car, I'd imagine she would have a similar list (assuming she's not - ha, I'm about to do this - a car person). Wow, this analogy is better than I thought.

You could argue that cars are superior to most software in that the basics of operation (again, driving and parking) are incredibly simple and intuitive. I would argue that that's not the case. You have to go to classes and pass a test to operate a car. What cars have going for them is that the UX hasn't changed appreciably in 100 years. This makes it seem incredibly intuitive because everyone is constantly exposed to it and it's the same when you're 15 as it is when you're 90.

My wife is an English professor. She's "not technical" but she's very willing to learn, and the approach I've tried to take with her is that, like learning anything else, the key is some repetition to create the muscle memory that allows you to integrate the steps into some sort of subconscious process (I know there's a term for this, but I'm not an education person).
Friday, 24 April 2015 14:19:16 UTC
Or you could read Ben's nice, succinct comment that says kind of the same thing I did and not need footnotes.
Friday, 24 April 2015 14:27:07 UTC
I don't think the field of technology is that much different from any other field in this respect. I have been watching houses go up from my back door for the past few months, and when I see the skill that the architectural and structural engineers have, I feel stupid. I watch a sport or a video game being played, and if I can't do what they can do, I feel stupid. I try dancing, and I feel extremely awkward and stupid. We, as computer engineers, have a tilt toward technology. We "get it" much easier and quicker than most others do. There's no fault to be placed, just as there shouldn't be any fault to be placed on me when I make a terrible attempt at dancing.

I think the best thing we can do is to realize that there are varying degrees of comfort with technology. We should understand our respective audiences and strive to make their lives easier instead of our own. If we want power toys, we can build ourselves power toys, but it shouldn't be at the expense of our users. We should also give them the same abilities, in a kinder, more comfortable format, so that they do not feel powerless.
Friday, 24 April 2015 14:51:46 UTC
You touch on an important point when you talk about the conceptual model in the mind of the user. Joel Spolsky already wrote about this. So basically, we as developers must make an effort to match the program model with the user model. The program ideally will work the way the user imagines it must work (whenever possible). Many times that's not easy, but whenever I deviate from that I find that it comes back to bite me.
Miguel
Friday, 24 April 2015 15:47:58 UTC
"I'm not musical...," so they never pick up an instrument. "I'm not a public speaker...," so they never address a group. "I'm not athletic...," so they never exercise. "I'm not good with money...," so they never save and invest.

My 5 year old can't even read well yet, but she can operate most of the computerized devices in the house. Literate adults have no excuse.

Bottom line Scott...you're an Enabler.
Friday, 24 April 2015 16:09:14 UTC
When we got our first computer in my childhood home, I took it upon myself (and so did my father, for a while at least) to start learning on my own how the thing worked. I formed friendships with others and we taught/learned from one another as far as how to configure, set up, modify, troubleshoot, etc., in order to make one's machine, one's property, do as one desired. I found a great deal of camaraderie in this for a time, but more importantly, satisfaction, empowerment, a sense of accomplishment, and a growing confidence through facing adversity and finding solutions. I realize that nowadays we want everything to be intuitive and the idea that one could even expect such a thing is undeniably progress! However, I think there is still much to be said for the willingness, interest, and effort to LEARN. I think a user's confidence will begin to build on small successes at first, but what I believe is more prevalent (although I would love to be wrong!) is a disinterest and a dismissive "but I JUST want it to work" or "but I want it to JUST work" mentality.
Matt
Friday, 24 April 2015 16:34:48 UTC
The biggest problem I've had helping "stupid" users is that they don't read what the computer is telling them, although I suspect this is partly because they have a "computer person" helping them who can do the thinking for them.

My typical "helping" will often follow the pattern
Me: "What does the message say?"
Other person: "I don't know, I clicked it away."
Me: "OK, make it show again."
Other person: "It says I need to do x."
Me: "OK, do x."
Other person: "Great, that's fixed it. You're a genius."

OK, that's slightly simplified, but I've "solved" many computer problems by just sitting next to someone, telling them to read what they are being told, and giving them the confidence that they are not clicking the wrong thing (maybe falsely, depending on how much attention I'm paying).
Friday, 24 April 2015 16:35:47 UTC
I feel your pain in dealing with tech support and parents. Numerous times, I've grown short with them for not 'getting' it and just done it myself. And they do write down the detailed list of steps for how to accomplish tasks which I deem simple. I do question, however, how, in the last 10 years of her career, my mom never figured out how a file system works. She worked with computers the whole time, and she still has problems with the whole "where did my file get saved?" issue.
Friday, 24 April 2015 16:40:07 UTC
Why do you write this as if it's somehow unique to computers? Have you ever watched someone bake for the first time, and then get frustrated when the dough doesn't rise or the cake doesn't bake correctly? And then people say, "Well, I'm not a baker." Recipes that my mother or grandmother made had all sorts of things left out of them, because those things were part of their intrinsic knowledge and didn't need to be in the recipe. But the first time you bake with that recipe, you find you're missing all the things that someone else assumed were "common knowledge".

In many ways, people self blame with computers because that's how most people react when interacting with something where they don't have common knowledge.

The difference with computers, compared to something like baking, is that people are expected to be more and more computer literate in order to function on a day to day basis, where other skills like baking are not deemed necessary for the average person.
Friday, 24 April 2015 17:24:45 UTC
I've done that "Install random program that turns out to be malware" before. Quite recently actually and I even consider myself to be a tech person. I was trying to find an old program and ran into a bunch of those "We've got the file you want, just get our download manager" sites. I skipped a bunch of them but after a few hours of no luck I finally clicked yes to one of them and then spent another few hours cleaning it up.

I blame myself for it, after all I'm the one that clicked "Yes". The thing is though, I understand what I did. I know why it was dumb and now I know to never trust those sites. That's the key I think. Instead of trying to teach people about computers we need to teach people that computers can be understood. Get people to think "I can figure this out".
Kyle
Friday, 24 April 2015 20:27:42 UTC
It's pretty shocking how difficult it can be for me to use an iPhone. I'm very good with computers. I'm a programmer.

I needed to upgrade my iCloud storage. Apple gives away only a pitiful 5 GB free and the phone was nagging me that it could not complete a backup.

Here are the steps:
(1) See the nagging dialog box complaining about cloud storage space.
(2) Tap Buy More Storage.
(3) Choose an upgrade, then tap Buy and enter your Apple ID password.
(4) Get a prompt for payment and enter all your credit card information.
(5) Enter your Apple ID password. Again? Okay.
(6) Get a prompt for payment and enter all your credit card information. Wait. Why?
(7) Realize that this is broken and that nobody at Apple ever tested the scenario that an iCloud customer was not already an iTunes customer.
(8) Poke around the icloud.com website looking for a way to do it from a web interface. Fail to find anything useful there.
(9) Get the idea to pre-enter the payment information and associate it with your Apple ID before trying again.
(10) Open the iTunes Store app and look for a way to manage payment. Fail to find anything useful there.
(11) Go to Settings > iTunes & App Store and look for a way to manage payment. Fail to find anything useful there.
(12) Use Google to find the "Change or remove your payment information from your iTunes Store account (Apple ID)" instructions at support.apple.com. (https://support.apple.com/en-us/HT201266)
(13) On the Home screen, tap Settings.
(14) Tap iTunes & App Store.
(15) Tap your Apple ID. Sign in with your Apple ID.
(16) Tap View Apple ID.
(17) Tap Payment Information.
(18) Change your information and tap Done. After you change your payment information, the iTunes Store places an authorization hold on your credit card. Thanks.
(19) Use Google to find the "iCloud storage upgrades and downgrades" instructions at support.apple.com. (https://support.apple.com/en-us/HT201318)
(20) Go to Settings > iCloud > Storage.
(21) Tap Buy More Storage or Change Storage Plan.
(22) Choose an upgrade, then tap Buy and enter your Apple ID password.
(23) Done.
Zack
Friday, 24 April 2015 20:34:23 UTC
The "Get a Tablet" advice *should* be good advice.

I like the idea of a safe, walled garden. I want to live there. But nobody has yet made one that is good enough. And the app stores aren't doing a good enough job keeping out the impostors and phishers. See this How-To Geek article: The Windows Store is a Cesspool of Scams

That core concept of sandboxed trusted applications that communicate with the system and other apps only through strict contracts and extensions should be the future of computing.

But iOS, ChromeOS, and Windows RT have been disappointing. The multi-finger swipe and button press/tap/hold gestures are often unintuitive and undiscoverable. They don't work on and with lots of different hardware. The one-and-only-one app store model has huge flaws--especially for internal line of business applications. (Let me choose my own gatekeepers. I should be able to designate which few sellers I trust and then download from only those places. Example: only Microsoft Store, Steam, and my company's IT department.)

I'm disappointed that Windows RT is dead. It seemed like maybe Microsoft actually wanted to eventually make it versatile enough for everyday computing.

This xkcd comic touches on how I feel: Troubleshooting
Zack
Friday, 24 April 2015 20:45:01 UTC
Just like when improving a relationship, there are general rules to follow for good UX design. These have not changed since I started teaching them in the 1980's. They are roughly in order of importance. A lot of the following can be found in Ben Shneiderman's books (search a bookstore for "designing the user interface" and related books).

1) The customer is always right. (Many of the comments contradict this, but reality supports it.) When a user input does not match any of the expected inputs, the UI should make reasonable suggestions and provide Introduction and Detailed help (usually as a moderated wiki). If the program was not prepared for a user response, it is the developer's fault, not the user's fault. Search engine spelling suggestions are one step toward this.
1a) Provide prompt about how to get started/continue if the user makes no input for a fairly long time (such as 60 seconds). Never assume that the user has abandoned the product.

2) The UX must always remain friendly. It is not the user's fault that a step was skipped (such as saving a file in the example in the main article) or that an extra step was added. The file saving example should have been worded more like "Would you like to save this file (name) now?".

3) Make it hard to cause a catastrophe. Deleted files and partial edits should be stored for later in a background (invisible to normal users) version control system (with self cleaning when there is a lack of space).

4) Show results as the user enters changes or commands. Provide for an almost infinite ability to go back (undo). Only request confirmation of a change when the change cannot be undone.

5) Treat users as individuals. Provide for multiple user experience levels (at least, beginner, experienced, and advanced). Experience levels include experience with the technology, experience with the application area, and experience with the product.
5a) When a user first uses any product, begin with reasonable defaults and assume that the user needs prompts to get started. Each prompt should have a "skip this next time" option.
5b) All commands should be available via an easy to use menu system (Microsoft still hasn't gotten this right). This is required for both new users and those who have disabilities. When a keyboard is available, always make pointing available from the keyboard.
5c) As the user becomes more experienced, suggestion prompts could automatically be suppressed.
5d) Provide keyboard shortcuts, movement triggered commands, and advanced menus (Advanced commands and Options) for more advanced users.

6) Minimize modality. Very few commands (like reset this device to factory settings and delete all user data) are "system modal": it does not make sense to do anything else until the user replies to this prompt. Only commands that cannot be backed-out should have a modal confirmation prompt. The rest of the time, users should be able to ignore the prompts and just keep working.

7) Use direct entry whenever possible. The spreadsheet and the fillable form are the most successful software interface styles.

8) Use breadcrumbs. Use a small part of the screen to provide a way to move between active tasks. Remember that the top level active task is selecting a task (the home screen). An analogue of this is (unless the user requests it) never blank the screen completely or turn off the device, just because the user hasn't made an entry.

9) Separate UI testing from UX testing. Test the user interface by both scripted and random inputs from an automatic testing system. Test the user experience by observing (and getting feedback from) users of various experience levels.
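Rule 4's near-infinite undo is commonly built on a command stack that pairs each action with its inverse; confirmation prompts are then only needed for the rare action with no inverse. A minimal illustrative sketch in Python (not from the comment above, just a common pattern, with hypothetical names):

```python
class UndoStack:
    """Minimal command-pattern undo/redo stack."""
    def __init__(self):
        self._done = []    # actions the user has performed
        self._undone = []  # actions available for redo

    def do(self, forward, backward):
        """Perform an action, remembering how to reverse it."""
        forward()
        self._done.append((forward, backward))
        self._undone.clear()  # a new action invalidates the redo history

    def undo(self):
        forward, backward = self._done.pop()
        backward()
        self._undone.append((forward, backward))

    def redo(self):
        forward, backward = self._undone.pop()
        forward()
        self._done.append((forward, backward))

# Usage: a document modeled as a list of lines.
doc = []
stack = UndoStack()
stack.do(lambda: doc.append("hello"), lambda: doc.pop())
stack.do(lambda: doc.append("world"), lambda: doc.pop())
stack.undo()  # doc is back to ["hello"]
```

Because every change is reversible, the UI never has to interrupt the user with "Are you sure?" for ordinary edits.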
Friday, 24 April 2015 21:50:52 UTC
Here's what Windows could / should do to improve the situation:

1. Encourage and help all downloadable software to be in the app store. I'm talking everything: all open source software (anything on Tucows, download.com -- without ads --, etc), free software from competitors, paid software, for tablets and for desktops... everything. If it runs on Windows, it should be in the store. Do you want to install FireFox or VLC (not just Metro) or OEM drivers/software or your company's application? Go to the app store. It should be the go to place for all software installations.

People shouldn't even consider using the browser as an installation medium. You never think of doing this on your phone, why would you on your desktop? Now, this would require a lot of legwork from Microsoft. It would have to go after software that's currently installed via the browser and convince them to move to the store, but I feel the future of the OS is at stake; it has to happen!!

2. Teach users how all this works. You know how a video game teaches you its mechanics as you play? The OS itself should have "training" sessions that do the same. Make the store app prominent and part of the OS, not just another app; it should be as visible as the volume control or battery indicator.

Attempts to install software from the browser should be caught and redirected to the store (at least for the most popular software, where false positives are low). If a user searches the Start menu for software that isn't installed locally, the OS should search the app store for it and offer to install it. There are many other subtle ways this could be achieved.

3. Microsoft should maintain a list of "bad actors" and prevent people from installing from those sources. I've seen a case where ransomware was sent to someone's email (Outlook 360) and the user installed it; it cost quite a bit of money to get the data back. This should not have happened! It was a fairly well-known piece of ransomware; why did Outlook let it through?

If you want a better OS, and I think you do, please lobby internally for these. Time is running out, and people have less and less patience for these kinds of things, especially when they don't have to worry about any of this on their phones and tablets.
Peter
Saturday, 25 April 2015 08:04:11 UTC
People are asking "can my Grandma use this?"... well, here's one better.

http://theuserisdrunk.com/

Saturday, 25 April 2015 09:47:07 UTC
Without taking the conversation too far off the original point, I think there's a wider aspect to UX than just software. The default solution to a lot of user interaction scenarios seems to be "add a touch screen and some software", an approach we've built towards from digital watches to MP3 players, PDAs, smartphones, tablets, etc., and now it seems to be the default position for everything from home appliances to elevators and car dashboards.

I think now is the time that we can start bringing new ways of connecting the physical and digital worlds to the mainstream. Things like haptic feedback, different display technologies like e-ink, voice and gesture interactions, and harnessing the massive computational power we now have for NLP.

All of this, and some creative thinking and design, can create interfaces that are more familiar, intuitive, and kinder to our bodies.
Aston
Saturday, 25 April 2015 14:43:07 UTC
Once upon a time those who could read books without moving their lips were believed to be in league with the devil. Reading was then a rare and elite skill. Mathematical skills beyond simple arithmetic were likewise an elite skill. Yet these days everyday skill levels in both these areas approach what were once seen as elite skills.

More recently driving a car was a rare and elite skill. And you had to be somewhat of a mechanic to use one as well since they broke down often.

The computer, or tablet, or anything else you can install software on is by nature a general purpose device. And learning to use one is a skill much like reading, or mathematics. With language you can express and capture any semantic idea and with mathematics you can express the underlying nature of the universe itself. And yet we take years to gain the skills to use written language and mathematics.

Perhaps the ultimate single-use computing device is the calculator. They all look broadly the same and, at least since reverse Polish notation fell from favor, they all work the same.

And yet, to use one I need to know significantly more than the "UI" of my calculator. To someone who is mathematically literate it makes sense. This is a key point. A user needs a second set of skills to interpret the UI.

The rub here is that any application, no matter how well designed and written, requires the user to have additional and often non-trivial skills to use it.

With today's UIs we have by and large standardized on a UI language even across platforms.

But there are multiple ways to express an operation in the language. Do I use a menu or a button or a slider or a swipe? Do I use bough or branch or twig or foliage to describe the characteristics of the tree I see in my yard? Interpreting the UI is going to take an understanding of the language of the UI, unless we want to limit ourselves to the UI equivalent of "See the red ball. See Spot run! Run Spot run!"

Let's take it as read that there are bad UIs out there, and these days there is no real excuse for that. But the UI discussion always goes deeper than that.

In my view, many of the complaints I see about UI design (beyond plainly bad UIs) come down to insufficient domain knowledge. Take a concrete example: aircraft flight planning. Even the best flight planning application will be completely opaque to anyone who has no knowledge of aircraft operations. To the outsider it appears to be the worst UI ever, but to the operations guy it makes perfect sense and he loves it. It cannot be self-explanatory.

And any UI that approaches self explanatory will be unusable in other ways, like trying to write a paper on nanotechnology in simple language. "See the atom. See the probe. See the probe move the atom. See the atom dance!" Painful isn't it?

Using e-mail takes some external knowledge about how e-mail works, just as using a calculator requires knowing how to add two numbers together. Without that, the operations have no context. Heck, they used to teach letter writing in school. It was not assumed that because you could write you could correctly format a letter, address an envelope, put a stamp on it, and drop it in a mailbox. Why then do we assume e-mail should be somehow intuitive merely because we can type?

To be sure there are badly written UIs. And malware is no different than the con-men that used to go door-to-door looking for easy marks. Only the technology has changed. It's a fact of life and it cannot be solved through technology, unless you can never install anything. Awareness of that has to be taught, like "stranger danger" taught to kids. Your "Google being out of date" is the modern equivalent of "Do you want to see some puppies?"

So what does this all really mean?

We have always learned skills from our mentors. We do not magically learn stuff without putting in effort. We didn't learn to read or drive a car just because the book was well laid out or the controls were well designed. Both those skills are built on a foundation of other skills that we were taught by our mentors. Thus it always was and thus it always will be. Using technology is like learning to read: it takes effort and work to get good at it, and you need the domain knowledge to use it effectively. Outstanding domain knowledge can overcome poor UI design. Excellent UI design will never overcome poor domain knowledge.

Thus it always was, and thus it always will be. Eventually technology will not be an elite skill but a common skill, like reading and writing are today. All it takes is time.

Earl Harding
Saturday, 25 April 2015 15:58:27 UTC
In these situations I tend to self-deprecate my work because the person using it matters more. They are the reason I built it in the first place.

"No way! This whole design here is all my doing. Probably a good idea if I keep it a bit dumb. Don't want a Skynet situation on our hands. But seriously, I do want to make it better. It's a constant work in progress and I value your feedback. What actions did you immediately, instinctively think to do to accomplish what you want to do?"

That way the UI is no longer personified as "smart and flawless" or given any kind of merit or social weight over the user. There's prior social and pop-culture context for any kind of modern technology (e.g. "Skynet"). You only need to remind the user that it is the same as a car or any other "meatspace" tool or mundane object, none of which we ever assume to be "smart" the way we do a computer.

Tarak
Sunday, 26 April 2015 08:19:02 UTC
Well it all works out when Cortana evolves into 'Her' :-)
Edward
Sunday, 26 April 2015 08:37:11 UTC
A bit more seriously. The sentence: 'Even Google, the most ubiquitous search engine, with the most basic of user interfaces can cause someone to feel dumb.'

A bit harsh, but at this point I would say: give it up. Really.
Edward
Sunday, 26 April 2015 17:24:04 UTC
I think the self-blame point is important. When I've taught classes, some students would get frustrated and blame themselves, and that would hinder them from moving forward. Think of the English language, with all its exceptions to the rules and weird variations (artifacts of the history of the language). Imagine blaming yourself and thinking you'll never "get it" when you misspell phone as fone. Could or should the help system be designed to let people know it is probably not them? Yes, I expect some people will scoff at this thought, maybe label it patronizing, and give it no further consideration. This is just meant as a thought seed, not a completely mature idea. The devil is in the details, and who knows, maybe a cloud-based program using machine learning will one day provide more targeted help and explanations based on the skills the user has already acquired.
Greg Frick
Monday, 27 April 2015 00:35:04 UTC
At the end of the day, UX implementation *IS* a zero-sum game. Every time you make an application "Easy to Use" or "More Accessible," you're removing some power from the user, be it to customize or troubleshoot.

Tablet apps are the big, modern example of "Easy to Use" gone too far -- they don't expose any errors to the user, even when they're something fixable by the user, so you're left uninstalling and reinstalling the app, losing some or all of your data in the process, in the hope that *maybe* some magic will happen.
Anon
Monday, 27 April 2015 11:24:18 UTC
This is what Bruce McKinney says about error messages in his book "Hardcore Visual Basic" :-)

"Put up a dialog box telling users they are filthy, unwashed idiots with ugly children. Tell them they have just committed error -2147211501 (&H80042713). Display the message "I am unworthy!" and make them click OK. Then terminate the program, throwing all their wretched, unsaved work into the bit bin."

At least our error messages could be much better if MS made the Win7 "Task Dialog" usable in .NET.
https://msdn.microsoft.com/en-us/library/windows/desktop/bb760441%28v=vs.85%29.aspx
Monday, 27 April 2015 12:45:40 UTC
My house. My wife. Every day.
bill
Monday, 27 April 2015 13:16:29 UTC
> In my recent podcast with UX expert and psychologist Dr. Danielle Smith the topic of
> "user self-blame" came up.

If only you had listened to her before the release of "Modern" UI and Visual Studio 2012...
ThomasX
Monday, 27 April 2015 13:23:54 UTC
This always reminds me of a saying

"UX is like a joke, if you have to explain it, it isn't very good"
Thursday, 14 May 2015 19:24:48 UTC
This is really true; I actually listened to the podcast with Dr. Smith.

I think it's worth looking at the other side of the coin. Think about how bad software used to be and how much less we as developers focused on the UX than today.

In my opinion, Apple shifted the focus to UX in a big way and I think things are improving in that regard.

We still have a long way to go. Many people do not find software accessible even today. But just think about how many more people are using software effectively today compared to 20, 15, 10 or 5 years ago. Huge segments of the population are getting more and more effective with software.

My dad, who is in his 70s and the opposite of a techie, can use Facebook, read news, and check his stock account. 10 years ago, he couldn't do any of it.
Tuesday, 19 May 2015 20:27:09 UTC
A lot of times when someone says "I'm not a computer person", what I'm hearing is a humble-brag: "I'm much too important to learn the trivial technical details that you computer people deal in. My life was meant for deeper things." It doesn't make me want to help them, because I know that anything I try to teach them will not be absorbed, and I'll end up doing it all for them. And all the while they'll be discounting my technical knowledge as something not worthy of their time.

Bob Clark
Thursday, 02 July 2015 17:12:16 UTC
<sigh> another person who knows everything...
Joe Marker
Comments are closed.

Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.