Technically Wrong PDF free download

Since , writer, historian and filmmaker Gwynne Dyer has written a widely syndicated newspaper column on international affairs, regularly published in 45 countries. So why is so much of the information in the news wrong? Is it herd instinct, official manipulation, robber-baron owners with ideological obsessions — or just the conflict between the inherently bitty, short-term nature of news reporting and analysis and the longer perspectives needed to understand what is actually going on?

How much misinformation stems from simple ignorance and laziness? How hard is it to get things right, and why do so many people in the media get things wrong?

Author: Peter L.

Consider the possibility that humanity, while engaging in ever more efficient and less expensive modes of computerization and automation, might effectively destroy real human economic activity.

A point could theoretically arrive when a vast array of goods and services would be available for sale, yet the number of available purchasers would be constantly diminishing, to the point where civility would begin to disappear and theft would become rampant.

These shamey opt-out messages do pretty much the same thing to all of us online: they manipulate our emotions so that companies can collect our information, without actually doing any of the work of creating a relationship or building a loyal following.

So there I was, trying as best I could to advocate for the people we were supposed to be designing for: cardholders who wanted to understand how to get the most value out of their credit cards.

Halfway through the one-minute video, a person in a gorilla suit walks into the scene and beats their chest, staying on-screen for a total of nine seconds. Half the participants routinely fail to notice the gorilla. In a later experiment, a gorilla the size of a matchbook was superimposed onto scans of lungs.

The radiologists were then asked to look for signs of cancer on the scans. A full 83 percent of them failed to notice the gorilla.

In this idealized universe, we all keep beep-beeping along, no neo-Nazis in sight.

It cares about gathering your birth date, so that your user profile is more valuable to advertisers. The fluttering balloons are an enticement, not a feature. Delight, in this case, is a distraction—a set of blinders that make it easy for designers to miss all the contexts in which birthday balloons are inappropriate, while conveniently glossing over the reason Twitter is gathering data in the first place.

Facebook has an internal metric that it uses alongside the typical DAUs (daily active users) and MAUs (monthly active users). Tech-industry insider publication The Information reported in early that nudging CAUs upward had become an obsession for Facebook leadership.

Uber designed its application to default to the most permissive data-collection settings. It disabled the option that would have allowed customers to use the app in the most convenient way while still retaining some control over how much of their data Uber has permission to access.

The result is a false dichotomy: all or nothing, in or out. The arrangement serves Uber's interests, just not yours. Uber is well known for having a male-dominated workplace that sees no problem playing fast and loose with ethics.

A proxy is a stand-in for real knowledge—similar to the personas that designers use as a stand-in for their real audience. Here, Google wanted to track my age and gender, because advertisers place a high value on this information.

The problem with this kind of proxy, though, is that it relies on assumptions—and those assumptions get embedded more deeply over time. So if your model assumes, from what it has seen and heard in the past, that most people interested in technology are men, it will learn to code users who visit tech websites as more likely to be male. Once that assumption is baked in, it skews the results: the more often women are incorrectly labeled as men, the more it looks like men dominate tech websites—and the more strongly the system starts to correlate tech website usage with men.

In short, proxy data can actually make a system less accurate over time, not more, without you even realizing it.

Digital products are designed to gather as much information about you as they can, even if that data collection does little to improve your experience. Because once our data is collected—as messy and incorrect as it often is—it gets fed to a whole host of models and algorithms, each of them spitting out results that serve to make marginalized groups even more vulnerable, and tech titans even more powerful.
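To see how the feedback loop described above plays out, here is a minimal, invented Python sketch (not from the book): a gender-guessing model retrains on its own labels for visitors to tech websites, and with each round its belief that "tech visitor means male" hardens, so more and more women are mislabelled. Every number and name in it is made up for illustration.

```python
import random

random.seed(0)

TRUE_FEMALE_SHARE = 0.40   # invented: actual share of women among tech-site visitors
prior_male = 0.70          # invented: model's starting belief, learned from older, skewed data

def label_visitor(is_female, prior_male):
    """Guess a visitor's gender from a weak behavioural signal plus the model's prior."""
    # The signal stands in for whatever proxies the model uses; it is noisy and
    # only mildly informative, so the prior carries a lot of weight.
    signal = random.gauss(0.35 if is_female else 0.65, 0.15)
    p_male = 0.5 * prior_male + 0.5 * signal
    return "male" if p_male > 0.5 else "female"

for round_num in range(1, 6):
    visitors = [random.random() < TRUE_FEMALE_SHARE for _ in range(10_000)]
    labels = [label_visitor(is_female, prior_male) for is_female in visitors]

    # The system "learns" from its own labels rather than from ground truth,
    # so each round's guesses become the next round's prior.
    prior_male = labels.count("male") / len(labels)

    women = sum(visitors)
    mislabelled = sum(1 for is_female, lab in zip(visitors, labels)
                      if is_female and lab == "male")
    print(f"round {round_num}: prior_male={prior_male:.2f}, "
          f"women mislabelled as men: {mislabelled / women:.0%}")
```

Run for a few rounds, the mislabelling rate climbs steadily even though the real population never changes—the model's own output is what skews its future beliefs.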

The only way to stop perpetuating the bias is to build a model that takes these historical facts into account and adjusts to rectify them in the future. The questions we need to ask are: Who decided what that desired outcome was? Where did the data come from? And how might that definition leave people behind?

Remember, neural networks rely on having a variety of training data to learn how to identify images correctly.

Wait a second. Roth notes that this only started to change in the s—but not necessarily because Kodak was trying to improve its product for diverse audiences.

Improving the product for black audiences was just a by-product.

It can actually get worse. The problem is that very few people have been talking about this—and meanwhile, because Google released Word2vec as an open-source technology, all kinds of companies are using it as the foundation for other products. So much for machines being neutral.

We need to demand instead that the tech industry take responsibility for the data it collects. We need it to be transparent about where that data comes from, which assumptions might be encoded in it, and whether it represents users equally.
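One way to see what assumptions an off-the-shelf building block encodes is to probe the published Word2vec vectors directly. The sketch below is illustrative only: it assumes the gensim library is installed and that Google's released GoogleNews-vectors-negative300.bin file has been downloaded to the working directory, and it runs the analogy-style queries that researchers (such as Bolukbasi and colleagues) used to surface stereotyped associations.

```python
# Probe a pretrained word2vec model for gendered associations.
# Assumes: pip install gensim, and GoogleNews-vectors-negative300.bin
# sits in the current working directory.
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# "man is to computer_programmer as woman is to ?"
# Queries of this shape are where the widely reported "homemaker"-style
# answers showed up in the research on embedding bias.
print(vectors.most_similar(
    positive=["computer_programmer", "woman"], negative=["man"], topn=5
))

# The same vector arithmetic on other occupation words surfaces other
# stereotyped pairings.
print(vectors.most_similar(positive=["doctor", "woman"], negative=["man"], topn=5))
```

Exact results vary by query and model version, but the point stands: whatever associations sat in the training text get baked into every product built on top of these vectors.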

By , that gap was even larger: a full 25 percent of black American internet users reported being on Twitter, compared with just 9 percent of white American internet users.

But during all of these product improvements, Twitter built precious few features to prevent or stop the abuse that had become commonplace on the platform.

The story was fake, its description was riddled with typos, and the site it appeared on was anything but credible: EndingTheFed.

Yet the story stayed at the top of the Trending charts for hours. Four different stories from EndingTheFed made it into Trending. Facebook did precisely what it had always intended with Trending: it made it machine-driven.

Could the Web—the entire flow of American information—come to be ruled by a corporate leviathan in possession of "the master switch"?

When Death has a story to tell, you listen. It is Nazi Germany. The country is holding its breath. Death has never been busier, and will become busier still. With the help of her accordion-playing foster father, she learns to read and shares her stolen books with her neighbors during bombing raids as well as with the Jewish man hidden in her basement.

You might assume that much of the attrition comes from women leaving to start or care for a family. But only about 20 percent of those who quit SET (science, engineering, and technology) fields leave the workforce. The rest either take their technical skills to another industry (working for a nonprofit or in education, say), or move to a nontechnical position. If the tech industry has acknowledged this problem and says it wants to fix it, why are the stats so slow to change?

Or so the story goes. Phillips examined a broad cross section of research related to diversity and organizational performance.

Some participants were assigned to diverse juries, some to homogenous ones. Across the board, diverse groups were more careful with details than were homogenous groups, and more open to conversation.

When white participants were in diverse groups rather than homogenous ones, they were more likely to cite facts rather than opinions, and they made fewer errors, the study found. In another study, led by Phillips and researchers from Stanford and the University of Illinois at Urbana-Champaign, undergraduate students from the University of Illinois were asked to participate in a murder-mystery exercise.

Each student was assigned to a group of three, with some groups composed of two white students and one nonwhite student, and some composed of three white students. Each group member was given both a common set of information and a set of unique clues that the other members did not have. Group members needed to share all the information they collectively possessed in order to solve the puzzle.

But students in all-white groups were significantly less likely to do so, and therefore performed significantly worse in the exercise.

And the best way to ensure that happens is to build a monoculture, where insiders bond over a shared belief in their own brilliance. The reality is a lot more mundane: design and programming are just professions—sets of skills and practices, just like any other field. If the industry admitted as much, it would seem normal, understandable, and accessible—and that would make everyday people more comfortable pushing back when its ideas are intrusive or unethical.

Not every tech company looks at the world like Uber does (thank god). And the company got there by doing what so few tech companies seem to bother with: considering their users as real, whole people. No rock stars, no gurus, no ninjas—just people who bring a combination of expertise, humility, and empathy.

Black people accounted for nearly 8 percent of engineers.

A good and short read. The problems it covers aren't any less worthy of discussion, though.

The sexist and racist culture is so embedded, the privileges so taken for granted, the arrogance and the belief that tech people are the coolest and smartest and above everyone else so fierce.

That needs to change.

Kathy Reid: A must-read for anyone who designs digital experiences and doesn't want to be an inadvertent dude-bro. Against a backdrop of increasingly ubiquitous technology, with every online interaction forcing us to expose parts of ourselves, Sara Wachter-Boettcher weaves a challenging narrative with ease. With ease, but not easily.

Many of the topics covered are confronting, holding a lens to our internalised "blind spots, biases and outright ethical blunders". As Wachter-Boettcher is at pains to highlight, all of this is not intentional - but the result of a lack of critical evaluation, thought and reflection on the consequences of seemingly minor technical design and development decisions.

Over time, these compound to create systemic barriers to technology use and employment - feelings of dissonance for ethnic and gender minorities, increased frustration for those whose characteristics don't fit the personas the product was designed for, the invisibility of role models of diverse races and genders - and reinforcement that technology is the domain of rich, white, young men.

The examples that frame the narrative are disarming in their simplicity. The person of mixed racial heritage who can't work out which single box to check on a form.

The person who's gender non-conforming and who doesn't fit into the binary polarisation of 'Male' or 'Female'. Beware, these are not edge cases! The most powerful take-away for me personally from this text is that in design practice, edge cases are not the minority. They exist to make us recognise the diversity of the user base that we design for.

Think "stress cases" not "edge cases". If your design doesn't cater for stress cases, it's not a good design. While we may have technical coding standards, and best practices that help our technical outputs be of high quality, as an industry and as a professional discipline, we have a long way to go in doing the same for user experience outputs. There are a finite number of ways to write a syntactically correct PHP function.

Give me form designers, and I will give you different forms that provide different user experiences. And at least some of those users will be left without "delight" - a nebulous buzzword for rating the success or otherwise of digital experiences. Wachter-Boettcher takes precise aim at another seemingly innocuous technical detail - application defaults - exposing their at best benign, at times malignant, utilisation to manipulate users into freely submitting their personal data.

It is designing not for delight, but for deception. And defaults are never neutral.

Artificial intelligence and big data do not escape scrutiny. Wachter-Boettcher illustrates how algorithms can be inequitable - targeting or ignoring whole cohorts of people, depending on the unquestioned assumptions built into machine learning models. Big data is retrospective, but not necessarily predictive. Just because a dataset showed a pattern in the past does not mean that that pattern will hold true in the future.
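As a purely illustrative sketch of that last point (not an example from the book), the snippet below trains a model on synthetic "historical" hiring decisions that were biased against one group. Even with the group label excluded from the features, a correlated proxy lets the model relearn and reproduce the old pattern on a new, equally skilled cohort. All names and numbers are invented.

```python
# Invented illustration: a model trained on biased historical decisions
# reproduces the bias, even when the sensitive attribute is left out.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

group = rng.integers(0, 2, n)            # 0 = majority, 1 = marginalised group (synthetic)
skill = rng.normal(0, 1, n)              # the thing we actually care about
proxy = group + rng.normal(0, 0.5, n)    # e.g. neighbourhood or school: correlates with group

# Historical labels: partly skill, partly straightforward discrimination.
past_hired = (skill - 1.0 * group + rng.normal(0, 0.5, n)) > 0

# Train only on "neutral-looking" features; the group label is excluded.
X = np.column_stack([skill, proxy])
model = LogisticRegression().fit(X, past_hired)

# Score a fresh cohort with identical skill distributions in both groups.
new_group = rng.integers(0, 2, n)
new_skill = rng.normal(0, 1, n)
new_proxy = new_group + rng.normal(0, 0.5, n)
preds = model.predict(np.column_stack([new_skill, new_proxy]))

for g in (0, 1):
    print(f"group {g}: predicted 'hire' rate = {preds[new_group == g].mean():.2f}")
# The marginalised group's rate comes out lower even though skill is identical,
# because the proxy feature lets the model relearn the historical pattern.
```

The model faithfully predicts what happened before, which is exactly why a pattern in past data is not, on its own, evidence of what should happen in the future.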

Yet governments, corporations and other large institutions are basing major policies and areas of practice on algorithms that remain opaque. And while responsibility for decision making might be delegated to machines, accountability for how those decisions are made cannot be. The parting thought of this book is that good intentions aren't enough.

The implications and cascading consequences of seemingly minor design and development decisions need to be thought through, critically evaluated, and handled with grace, dignity and maturity. That will be delightful!


