New Book Warns CS Mindset and VC Industry are Ignoring Competing Values (computerhistory.org)

So apparently three Stanford professors are offering some tough love to young people in the tech community. Mehran Sahami first worked at Google when it was still a startup (recruited to the company by Sergey Brin). Currently a Stanford CS professor, Sahami explained in 2019 that "I want students who engage in the endeavor of building technology to think more broadly about what are the implications of the things that they're developing — how do they impact other people? I think we'll all be better off."

Now Sahami has teamed up with two more Stanford professors to write a book calling for "a mature reckoning with the realization that the powerful technologies dominating our lives encode within them a set of values that we had no role in choosing and that we often do not even see..."

At a virtual event at Silicon Valley's Computer History Museum, the three professors discussed their new book, System Error: Where Big Tech Went Wrong and How We Can Reboot — and thoughtfully and succinctly distilled their basic argument. "The System Error that we're describing is a function of an optimization mindset that is embedded in computer science, and that's embedded in technology," says political scientist Jeremy Weinstein (one of the book's co-authors). "This mindset basically ignores the competing values that need to be 'refereed' as new products are designed. It's also embedded in the structure of the venture capital industry that's driving the growth of Silicon Valley and the growth of these companies, that prioritizes scale before we even understand anything about the impacts of technology in society. And of course it reflects the path that's been paved for these tech companies to market dominance by a government that's largely been in retreat from exercising any oversight."

Sahami thinks our technological landscape should have a protective infrastructure like the one regulating our roads and highways. "It's not a free-for-all where the ultimate policy is 'If you were worried about driving safely then don't drive.' Instead there's lanes and traffic lights and speed bumps — an entire safe-driving infrastructure which arrived through regulation." Or (as their political science professor/co-author Rob Reich tells the site), "Massive system problems should not be framed as choices that can be made by individual consumers."

Sahami also thinks breaking up big tech monopolies would just leave smaller "less equipped" companies to deal with the same problems — but that positive changes in behavior might instead come from government scrutiny. But Reich also wants to see professional ethics (like the kind that are well-established in biomedical fields). "In the book we point the way forward on a number of different fronts about how to accelerate that..."

And he argues that at colleges, just one computing-ethics class isn't enough. "Ethics must be embedded through the entire curriculum."
  • Maybe? (Score:4, Insightful)

    by Aighearach ( 97333 ) on Monday November 08, 2021 @06:37AM (#61967817)

    Lots of nice words, but what exactly are they talking about? Ethics? CS and IT don't choose the ethics, the MBAs do that?

    • Re:Maybe? (Score:4, Insightful)

      by Junta ( 36770 ) on Monday November 08, 2021 @07:19AM (#61967907)

      "I'm just following orders" defense?

      If you willingly work with a company and willingly do work that you feel is ethically questionable, you should feel responsible rather than saying 'oh well, the execs are the evil ones, what can I do'.

For me personally, I'm making about half of what I could have made because I declined a job I disagreed with ethically, though it was very much legal and common. Of course when half of potential income is still very comfortable, it's an easy price to pay.

This. Notice your words aren't the kool-aid. OP is +3 "insightful" for a shallow comment. You haven't been modded up yet and you argue the general ethical questions. People want to wash their hands and move on with life; it's our cultural way, which I would argue even has religious roots in our culture.

        More to the OP, if no one works for those MBAs and companies, they will have a hard time producing a product. Right now it's a good market for employees and little of the discussion seems to focus on the ethic

I generally agree with what you are saying, but I don't think it is always possible. Right now the economy is great for engineers. Everyone can afford to be selective about what work they take. But when the economy is bad and you have a family to take care of, or aging parents that need help, things aren't so easy.

          I'm old enough to remember the 2008 crash. I don't remember hearing of very many people turning jobs down.

          • by whitroth ( 9367 )

            I'm a sysadmin. Let's see, my last contract had ended in early '09, living in Chicago, which I really like, and I had the choice between third shift for a stock trading company out in the 'burbs, or relocating to the DC metro area to work a long-term contract for the NIH.

            Let me just say 'hi' from the DC metro area.

        • People want to wash their hands and move on with life

          In my experience, CS folks don't even wash their hands. Should we talk about that before the broader ethical concerns?

      • Depends (Score:3, Insightful)

        by JBMcB ( 73720 )

        On what you are developing, and you don't necessarily know what that might be for.

A relative of mine develops Instagram filters. Is that a fun feature that people use to have a good time, or does it drive children who shouldn't be using Instagram to the platform? Or neither? Or both? How would you know?

        Let's say you work for Twitter and you are asked to optimize a database insert. Is that to cut down on a performance regression, or make the existing platform more scalable, or are they going to try to hoov

      • Re: (Score:3, Insightful)

        by fazig ( 2909523 )
        Depends.

        I have worked on a 'jamming resistant' optical inertial navigation system (fiber optical gyroscopes) in the past. Error drift was minimized to make them accurate enough over distances like intercontinental flight.

        Could that potentially be used in an ICBM that is launched starting the nuclear apocalypse? Sure. But was it my decision to use it for that application? No.
        When I worked on that thing the intention was to create a small and light inertial navigation system with no moving parts to be us
        • by Junta ( 36770 )

Sure, there are scenarios where the product's application may be obfuscated and the developer reasonably doesn't realize a potentially questionable application of their work. In that case, sure, the developers can be blameless, so long as it is not willful ignorance where they don't 'know' merely because they expressly don't want to know.

          However, I think most of the common things (like say harvesting personal data of people with whom you have no business relationship) it's hard for the devel

          • When I worked for a company that was harvesting personal data for advertising, everyone at the company had some justification in their mind for why it was ok. Things like, "We are just trying to show people ads that they want to see." And it is true, if I only saw ads for things I wanted to buy, I would prefer that.

            Eventually I ran out of justifications that I believed in, and quit. No more advertising work for me.

I had a coworker who was extremely adamant that he never use adblock. He said that he WANTED to see the advertisements, otherwise how else would he know about new products. He didn't seem to mind that he kept seeing the same stupid Chrysler ad over and over, or that the web pages took many seconds to load or that his machine was slowing down from all the crappy javascript infesting it. And he never even worked for a web based company based on advertising (possibly just naive, he was kind of an oddball i

        • by SirSlud ( 67381 )

          When I was getting my engineering degree, we had to take ethics and values class specifically *because* engineers are highly involved in the things they develop and how they *might be used* and you can't just wash your hands of that. You may not have control over how an invention gets used, but you can certainly try and take into account the good and bad ways it might be used and it may influence what you actually develop.

        • by jvkjvk ( 102057 )

>You can't hold engineers responsible for every little thing that's being done with the technology they help to create.

Right. If that were the case then nothing, ethically, should ever be developed (because anything "could" be used unethically). Whether something is ethical depends on the situation, not the technology.

          On the other hand, I was working for a company that was thinking of doing business with China in the early 90's. (Think 1989 Tiananmen Square protests and massacre). I cou

You can't hold engineers responsible for every little thing that's being done with the technology they help to create.

          Very true. A hammer can build a house or it can bash someone's brains in. A pencil can write a love letter or you can stab someone in the throat with it. (Most?) everything invented can be misused. You can't blame the inventor for that. As you mention, if we stopped making things because they can be misused, we wouldn't even have rock tools.

      • If you willingly work with a company and willingly do work that you feel is ethically questionable, you should feel responsible rather than saying 'oh well, the execs are the evil ones, what can I do'.

I agree completely! People leaving evil decision-making to executives is unacceptable! If we integrated evil-doing into teaching CS then I think we could crank out a whole generation of Mark Zuckerbergs to maximize our dystopian future. Just think of the profits! ;)

      • If you willingly work with a company and willingly do work that you feel is ethically questionable, you should feel responsible rather than saying 'oh well, the execs are the evil ones, what can I do'.

        The flip side of this is that the person that presses the button shouldn't privilege their own views over the inputs of everyone else.

        The obvious example is, right now, there's presumably some fighter jet pilot zooming around, and if they one day decide "you know we have an ethical duty to bomb China for their

        • > his position in the cockpit doesn't make his input more (or less) valuable than everyone else that doesn't have that position.

          It does though. Because his is the input that actually makes the decision. No general, president, etc. has that power, it lives only in the men and women who actually carry out their orders, or not.

We're all sitting around able to have this nice civilized conversation today precisely because on several occasions the soldier who literally had their finger on the nuclear-mis

          • I mean, the system of ensuring that the decision to go to war or to make peace is only made by a representative government seems to me far more legitimate than just allowing some individuals to make the decision for everyone else by individual fiat.

            Making the individual responsible is an excuse -- a cope to deflect away from our shared responsibility to make the right collective decisions. We're human, we'll fail at that sometimes (or even often, but I think we're doing better) but it's ignoble to try to sh

            • I don't disagree in principle, but name one country on Earth where major decisions are actually made collectively.

              Hell, our once-great USA is probably the best-represented of the major military powers on Earth, and was long held up as the bastion of democracy. But even ignoring corruption, poor voter turnout, and rampant propaganda... best-case it's still a small minority of the population that actually gets well represented: roughly half of the population votes for the losing candidates, and thus gets n

              • I don't disagree in principle, but name one country on Earth where major decisions are actually made collectively.

                You moved the goalposts all the way from "by a representative government" to a fucking commune? Wow!

                You really didn't know that representative governments are in fact the ones who make decisions about when to go to war in much of the world? You're amazingly stupid.

                • Who said anything about a commune? That's an economic arrangement, not a political one.

I'm just pointing out that a genuinely representative government should represent the will of its entire population, not just a tiny slice of it. And that there is currently no such national government on the surface of the Earth. At best they represent the will of a small fraction of voters, and in practice virtually all of them only really represent the will of a small group of political heavyweights who game th

                  • Who said anything about a commune?

                    You did
                    You were probably trying to be clever by substituting the wrong word.

a genuinely representative government should represent the will of its entire population

                    No, a representative government represents the people who elected it.
There is no such thing as a "will of its entire population." There is more than 1 idea of how things should be run. That is the very problem that representative government solves.

                    • Not quite - a representative government is supposed to represent the entire electorate, not just the minority whose preferred candidate won.

                    • False.

You're just introducing an impossibility as a straw man, apparently without noticing you did not invent the idea of representative government. You don't get to redefine what "representative government" means.

              • I don't disagree in principle, but name one country on Earth where major decisions are actually made collectively.

                There are none because Democracy doesn't work. It never has and it never will. At least not until there are some requirements for voting beyond "I made it to 18".

                Democracy is a shit-show when it is actually used.

                • Please enlighten me on where it's actually used.

I hear lots of "mob rule" scare-mongering around any mention of any kind of real democracy, as though you could get a mob to proceed through the polling place in an orderly fashion. But it's never accompanied by any real-world evidence, usually just platitudes from those whose personal power is undermined by democracy.

                  • Please enlighten me on where it's actually used.


                    Not interested in educating you. You have access to the internet. Go look it up yourself.

                    2,000+ years of history and there has never been a stable democracy. They ALWAYS devolve into dictatorship. There certainly are countries that have _more_ democracy than others, but every functional government I can think of is a Republic.

My question for you would be: Do you really think people would even be interested in voting on every piece of legislation that comes down the pipe? That's what you have to do

                    • I would say *every* government that's been around long enough has always devolved into a dictatorship.

                      You seem to be thinking of what I would call a direct democracy, and in fairness there are those that claim that it is the only form of true democracy. I would agree that they're unwieldy at best. But neither do they necessarily devolve into dictatorships. Take Athens, widely held to be the earliest Classical democracy, and which had aspects of both direct and representative democracies - it remained st

                    • I would say *every* government that's been around long enough has always devolved into a dictatorship.

                      I would agree with that. But Republics tend to last longer. Pure democracy devolves pretty fast. Most hippy communes didn't make it thru a whole decade.

                      You seem to be thinking of what I would call a direct democracy, and in fairness there are those that claim that it is the only form of true democracy.

                      Yeah. That's why I used the terms Democracy and Republic to differentiate.

                      I would agree that they're unwieldy at best. But neither do they necessarily devolve into dictatorships. Take Athens, widely held to be the earliest Classical democracy, and which had aspects of both direct and representative democracies - it remained stable for 170 years until conquered by Macedon, almost as long as the US has existed.

                      170 years... Not exactly enough time to see what would have happened. Nice cherry pick. Using 245 years (Independence) for the US, well... We're nowhere near where we started out. (Left/Right is irrelevant to this) 245 years in and we have Executives who have WAAAAY more power th

                    • We certainly have come a long way... arguably mostly in the wrong direction. Though counting the age of the country from the declaration rather than the ratification of the Constitution seems a bit dishonest - at the time of the rebellion it was not at all clear that the alliance would outlast the rebellion to form one nation rather than thirteen. Heck, it wasn't a sure thing when they met to hammer out a constitution after the war.

                      I have mixed feelings about the 17th - it opened the door to its own probl

                    • I don't agree with everything you said. But I appreciate the level of thought behind it. I'll give you credit for that.

Cherry picking nothing - when your assertion is disproven without even looking past the very first recorded example of democracy, that's a pretty compelling argument that you've overstated your case.

                      I disagree. 170 years is not enough time. Neither of us knows what would have happened. What we have seen is LOTS AND LOTS of democracies devolve into dictatorships. Even assuming Athens would have made it, that's still one case. Every other Democracy I can think of has gone dictatorship or is headed towards it.

Even with the benefit of hindsight I'm not sure it's entirely clear that they replaced bad with worse. In retrospect it certainly seems to have reduced states' rights - but for a while at least they returned the Senate to actually representing the people of those states, rather than just the most powerful corrupt politicians that had risen to power in them.

                      I maintain the job of the Senate was to represent the individual STATES, not

                    • Aw %$#@!... I just accidentally "backed" a nice wall of text into oblivion. Don't think I've got the energy to type it again tonight. Well see if a condensed summary strikes me later.

If you even know what's going on. Most people in an organization won't see any ethical issues even if the company is blatantly screwing over the customers, the public, everyone else. Often the sharp turn towards being evil happens after 99% of the product line is done. What in Facebook would a rank and file engineer ever see that indicates they're doing something wrong? Someone may say "I'm trying to provide cheap and affordable internet accessibility to emerging third world countries" but behind the scen

      • "I'm just following orders" defense?

        "This road can be used by anybody! If somebody bad drives down this road, it is the road-builder's fault!"

        It's stupid.

        No, it isn't the "just following orders" defense. It is the "that isn't even the part of the thing that this person did" defense.

        They love it in Italy; throw people in jail for not predicting an earthquake. But blaming people who have nothing to do with it is not Virtuous or even ethical.

    • by AmiMoJo ( 196126 )

I think what they are trying to say is that there need to be more ethical considerations in tech. There is a tendency to develop technologies and services without considering the potential harm they can do.

      Facebook would be a great example. I doubt anyone really considered the psychological impact it might have, and certainly not its ability to interfere with democracy and act as an effective tool of malicious actors.

      • Ethics are tough; morality tougher still.

        Thinking things through isn't what we do as a culture, generally. There are a lot of dubious technologies, privacy issues, out-and-out robbery/theft/fraud, and plentiful money spent in protection.

        The moral issues of bias in AI/NN, non-legal tax avoidance, plainly silly stuff like NFTs, disinformation-by-bots, and so much more aren't handled in policy or democracy, let alone enforcement of what little law exists.

        Empires built by the tech titans rule us now. But even a

      • >I doubt anyone really considered the psychological impact [Facebook] might have, and certainly not its ability to interfere with democracy and act as an effective tool of malicious actors.

Umm... maybe not up front, but we're talking about the company that *specifically* ran experiments determining how to most effectively manipulate their users' emotional states. Not to mention left all the same election-manipulating tools in place after it was conclusively demonstrated how they were used to manipulate t

        • You're conflating the job the programmers do with the job the business executives do, apparently only because Zuck originally wrote some code?

          The outcomes they studied have to do with the details of what configuration values you set, it's not the part of the technology that the programmers are responsible for. The programmers create a system that has those configuration values; it could just as easily be used to make sure that the outcomes are more positive!

          It's the executives and other managers who are mak

          • Take a good hard look at the business world around us. Are you honestly going to tell me that there wasn't always a really good chance the tool would be abused like this?

            As the person creating the tool, you have a responsibility to take off your optimistic blinders, take a good hard look at how it will likely be used, and decide if your creative effort is likely to bring substantially more harm than good into the world. There's always other employers. Other projects where your skills could be put to less

            • That's the stupidest argument you could make.

              Look at the violence and crime in the world around us. Are you honestly going to tell me that there wasn't always a really good chance somebody would use a hammer to be naughty?!

              Is that actually a legit argument for blaming the blacksmith for having made the hammer? For accusing him of being complicit in its misuse?

              Stop being such a blatant idiot. If you want to blame the programmer, you have to show that there wasn't a legit use for the tool. Not that misuse w

              • The question is not whether bad things will be done with it - obviously they will.

                The question is whether, on balance the bad things are likely to significantly outweigh the good.

                For a hammer obviously not - there's plenty of other things that will be as good or better as a club, but nothing that comes close for hammering nails.

                For an automated surveillance and propaganda delivery service that also lets you sort-of-connect with your friends and their friends? It's not quite so clear cut.

                • See, now you've fallen into braindead hyperbole. You can't even have a reasonable discussion. You start spewing hyperbole so quickly, so frequently, you're no longer able to tell it from reality. When you try to think about things like ethics and responsibility, you accidentally put your hyperbole onto the ethical scales instead of people's actions. I say you're guilty!

                  • Hyperbole? I assume you're talking about my surveillance and propaganda comment? That's not hyperbole, it's the literal truth.

                    Facebook, Google, and the like surveil every click you make on their pages. Look at the address shown in the status bar when hovering over off-site links from Google results or from Facebook posts - notice the fact that they actually go to the parent site rather than the supposed target? The *only* reason to do that is to ensure that even if you've disabled all browser scripting

    • Lots of nice words, but what exactly are they talking about? Ethics? CS and IT don't choose the ethics, the MBAs do that?

With such a nihilist attitude, you are part of the problem as well. Sure, CS and IT don't choose ethics, but guess what, MBAs aren't the only ones. A lot of the people behind the current shenanigans at FB or Google are engineers and scientists themselves.

Ethics starts at the individual, and there's a point when we must ask ourselves if we want to work at X or Y company, if the compensation and thrill is enough. Very often we do that when environments are toxic, but seldom do we when the product or service is it

      • With such a nihilist attitude

If you didn't listen to me, you can't argue with me.

        All you can do is spew bullshit.

        Holding the correct people responsible is not what "nihilism" is.

        Find a fucking dictionary.

    • by sinij ( 911942 )

      Lots of nice words, but what exactly are they talking about?

      They are talking about tech no longer being about just code, but instead having to consider its impacts on the fabric of society. It turned out that changing the world for better through tech is much harder than anticipated.

      • Duh.

        But they didn't remember to consider: Who is the person who considers the impacts?

        Is it the person who writes the specification, or the person who implements the specification?

        For example, the programmer includes the gathering of telemetry data as part of the development process. It goes into a database. They no longer have any control over that data. They do not have organizational authority to control who else gets that data. They are not even in the loop to know who else gets that data.

        Why is it fa

As someone with an MBA who took the classes post-Enron: the curriculum for each of the classes included ethics. They taught ethics, not morals; you didn't get a solid set of commandments on what is right and what is wrong, but were taught to think about the wider effect of such decisions, and to understand how the short-term consequences of a long-term plan still have an impact which needs to be addressed.

For example, improving worker efficiency, and bringing in tools such as automation and robot

As someone with an MBA who took the classes post-Enron: the curriculum for each of the classes included ethics.

        So what? Nobody said that MBAs are unethical because they didn't know, that they just accidentally were assholes. The accusation is that they choose to be assholes, and they already know better.

I haven't heard anybody (other than the MBAs who get caught for something) suggest that maybe education is the answer. The solutions live in punishing them in a way that harms their business interests, and harms the business interests of the people who looked the other way.

        in teaching CS and other classes, ethics should be taught as well

        It is. But the relevant ethics are

    • >Lots of nice words, but what exactly are they talking about? Ethics? CS and IT don't choose the ethics, the MBAs do that?

      Of course CS and IT chooses ethics - every time they take an action or build a tool that has an ethical implication. The boss can say "Do X", but *you* are the one who actually does it. If it's unethical, find another boss.

      I double-majored in CS and engineering, so I think I've got a bit of useful perspective.

      In Engineering you're taught that you must *always* consider the ethical i

    • by jvkjvk ( 102057 )

      >Lots of nice words, but what exactly are they talking about? Ethics? CS and IT don't choose the ethics, the MBAs do that?

      So they chose *your* ethics for you? Hmm. No thanks!

    • they don't have a well-thought-out opinion. They just took some of the hot issues of today (privacy, AI racial bias, job loss) and blamed programmers for it. They blame Stanford for "big tech’s relentless focus on optimization" but "relentless focus on optimization" is what Harvard does, and has taught for decades, not Stanford CS or Silicon Valley. It also doesn't make sense.

      I tried to find what exactly they want to be taught in ethics classes to make things better, but I couldn't find that anywhere.

      • Presumably if they convinced the management to have better ethics, they would want the changes to be optimized.

A lot of people don't really understand that collecting data is not like shooting somebody and saying you were "following orders." It's like blaming steelworkers for the existence of guns. "But you knew they could make guns from steel!"

        • Presumably if they convinced the management to have better ethics, they would want the changes to be optimized.

          True point.

CS doesn't even really choose the design either. Someone up top says "build a device to let us take over the world, and with nice easy push buttons". That is, CS as a shortcut for engineering jobs involving computers, as opposed to CS the science. So maybe their message is for the CTOs of the world? Because having done this for 30+ years I can only recall a couple of times when my input was wanted for any "competing values". Even at Facebook I suspect that Zuckerberg was the only one who ever mattere

  • by Anonymous Coward
    I want students who engage in the endeavor of building technology to think more about security. Enough of having Android apps connect directly to fully-open SQL Server databases on the internet using sa credentials. Enough of storing passwords and other PII in databases in plaintext. When are universities actually going to start including security considerations in their CS101-level courses?
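    On the plaintext-passwords point, a minimal sketch of the standard alternative using only Python's standard library. This is illustrative, not from the comment: the function names and the iteration count are assumptions, and real systems often use a dedicated library (e.g. argon2 or bcrypt) instead.

    ```python
    import hashlib
    import hmac
    import os

    ITERATIONS = 600_000  # illustrative work factor; tune for your hardware

    def hash_password(password: str, salt: bytes = None) -> tuple:
        """Derive a salted PBKDF2-HMAC-SHA256 digest. Store (salt, digest), never the password."""
        if salt is None:
            salt = os.urandom(16)  # fresh random salt per password
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return salt, digest

    def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
        """Re-derive the digest and compare in constant time to avoid timing leaks."""
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return hmac.compare_digest(candidate, digest)
    ```

    Because each password gets its own random salt, two users with the same password produce different digests, which defeats precomputed rainbow tables even if the database leaks.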
    • Security is a CS topic. It's just not lower division stuff, as it's hard. Not talking about buffer overflows and stuff, but real crypto security stuff involving higher level abstract mathematics. I certainly learned crypto as an undergrad, eons ago, but it's rarely central and not more than a few days worth overall, enough that when you encounter PKI you can understand what it is and how pre-shared keys are really naive.

  • Sounds like those ill-equipped types who want to tell others who learned a craft how to practice it.

We are specifically writing code to shift the balance. We write things you don't like, and instead of writing something better, Mr. Scholar, you can only complain that our talents are serving us instead of you....

    No, how about you remember the part where society outcast the nerds and geeks. The part where we were mocked, told technology was stupid, etc. Now that we grew up and were right, it's kind of our right to use our powers that you could have developed too if not for mocking us.

    • Re: (Score:3, Insightful)

      by drinkypoo ( 153816 )

      All I hear is a lot of petulance.

      The problem isn't from what nerds create on their own. It's from what they create for money because they need to earn a paycheck.

    • Sounds like those ill-equipped types who want to tell others who learned a craft how to practice it.

      It seems he is actually saying there are ramifications to the code you write and understanding the ethical considerations is important.

      Or, as Tom Lehrer put it so eloquently:

      Don't say that he's hypocritical

      Say rather that he's apolitical

      "Once the rockets are up, who cares where they come down?

      That's not my department!" says Wernher von Braun

      No, how about you remember the part where society outcast the nerds and geeks. The part where we were mocked, told technology was stupid, etc. Now that we grew up and were right, it's kind of our right to use our powers that you could have developed too if not for mocking us.

      Here's the deal. If the industry doesn't clean itself up someone else will do it to them in the name of cleaning it up. Both sides of the aisle have grievances against

    • instead of writing something better mr scholar, you can only complain that our talents are serving us instead of you

      The book is literally about the big tech companies (i.e Facebook, Google, Apple, Amazon, etc.) and how their practices of surveillance capitalism, use of biased algorithms and general disregard of human rights in favor of profit is bad.

      If you think any of those things is serving you, then you've drunk more of the corporate Kool-Aid than you should have.

      Ethics don't exist in computers. It's just people mad at the kinds of programs I can write with all my freedom.

      You are free to give Jeff Bezos and Mark Zuckerberg all the rim jobs they want and call it freedom, but the rest of us would rather not keep contributing to

    • I don't know why people are shocked that tech is misused, or at the growth of professional troll farms attempting to steer political conversation.

      The term "astroturfing" precedes computers, as companies would pay people to stand around in bars holding new drinks like Zima, or New Cool Vodka, and before that, think tanks to pay people to generate policy papers and appear as talking heads.

      Go watch Citizen Kane. It's been going on a lot longer than anyone's been alive.

      "Really, Charles! People will think..."

      "What I tell them to think!"

      • The term "astroturfing" precedes computers

        That is remarkable, since Astroturf wasn't invented until the 1960s. I remember Astroturfing becoming a term for propaganda in the early 1990s.

    • Ethics is phony baloney in all spheres of human activity. In the case at hand, it's just a ploy to sell books.

      The universe has only one rule: don't do what doesn't work. Society adds another real rule: don't do what you can't get away with. Then society adds a whole lot of phony rules because that real rule is just too embarrassing to own up to.

      I used to be bitter, now I just don't care.
  • ...code responsibly! /end sarcasm
  • Very funny (Score:5, Insightful)

    by TheNameOfNick ( 7286618 ) on Monday November 08, 2021 @06:58AM (#61967861)

    Sahami also thinks breaking up big tech monopolies would just leave smaller "less equipped" companies to deal with the same problems

    Or sad. Depends how you look at it. Loyal dog doesn't even need to get paid to shill for his master: Telling people to mind the consequences of their endeavours, and at the same time making sure the monopolists he helped create can run roughshod over all concerns because they're too big to regulate.

    Break up the big five. The smaller the better.

    • by dfghjk ( 711126 )

      My reaction exactly. Now that he has got his, others must apply resistance where he did not, and his team, likely still profiting him personally, should be free of any constraints on their power.

      "Less equipped" companies are what we want. Government needs to regulate against the worst results of capitalism. He doesn't understand this, ignore his bullshit.

  • The problem is.. (Score:5, Insightful)

    by Junta ( 36770 ) on Monday November 08, 2021 @07:10AM (#61967891)

    The people aren't oblivious to the ethical implications when they do it, they just don't care. Therefore a book seeking to point out that there are ethical concerns targeted toward the people who already know full well there are ethical concerns isn't going to do much.

    There are people who are mindful of this sort of thing in the industry. The problem is that the business side very much rewards the more questionable behavior over carefully good behavior. So the choice is a high income with perhaps questionable ethics but no legal consequences, versus a more strongly ethical career with a more modest income and no better standing from a legal perspective.

    If you feel that ethics are poorly handled in the industry, sadly you must get some specific regulation to rein in the offenses you think are prevalent rather than appealing to the entire employee pool and hoping you get all of them.

    • The people aren't oblivious to the ethical implications when they do it, they just don't care. Therefore a book seeking to point out that there are ethical concerns targeted toward the people who already know full well there are ethical concerns isn't going to do much.

      You start somewhere. You start by documenting an issue and making an argument for your take. Whether it has an effect or not is irrelevant *at that particular point.*

      There are people who are mindful of this sort of thing in the industry. The problem is that the business side very much rewards the more questionable behavior over carefully good behavior. So the choice is a high income with perhaps questionable ethics but no legal consequences, versus a more strongly ethical career with a more modest income and no better standing from a legal perspective.

      If you feel that ethics are poorly handled in the industry, sadly you must get some specific regulation to rein in the offenses you think are prevalent rather than appealing to the entire employee pool and hoping you get all of them.

      All of what you said is true, but that does not preclude us from elaborating a case against the status quo and suggesting changes.

      The alternative is to just throw our hands up. That might be ok for some people, but I can guarantee it is not for others.

      • by Junta ( 36770 )

        I'm just saying directing at the employee pool first and regulatory action as a potential afterthought is a bit backwards.

        In the disciplines where you do have professional standards with principled behaviors as integral to the career, there's a legal framework backing them up where companies seeking profit through unethical practices would be opened up to civil and/or criminal penalties. In the CS neck of the woods in the areas I presume the authors are worried about, they can bring bad press down upon the

    • The problem is the business side very much rewards the more questionable behavior over the carefully good behavior.

      And this is a social problem. As the USA pushes further and further toward a purely capitalist society, ethics will continue to fall by the wayside. Wealth is an increasingly louder voice in government and legislation is increasingly being written to give ever more power to corporations.

      The sad part is, a big chunk of the population is cheering this stupidity on because they've been brainwashed into thinking that any regulation is impinging on their freedoms even if that regulation hurts them and only hel

      • by ganv ( 881057 )
        We do need to move past the simple-minded extremes on the balance between laissez-faire innovation (driven by free choices by businesses and individuals) and centrally regulated innovation (driven by central government laws and bureaucracy). As you say, there is a brainwashed right that celebrates no regulation. Hopefully you also can see the brainwashed left that imagines they are smart enough to regulate innovation in ignorance of the sordid history of centrally controlled economies degenerating in
    • by ganv ( 881057 )
      This problem of technological advances allowing the creation of social systems and choices that don't end up optimal for society is an old one. The early industrial revolution, the automobile revolution, and now the digital revolution have all followed a similar trajectory. There are many others. As Junta notes, these books don't help much. It is quite easy to identify that it would be good if entities (individuals, companies, governments, standards bodies, etc.) that understand
  • should climb out of their ivory towers, and go out and home the homeless and feed the food-less for a while.

    They don't seem to be adding much to better humanity.

    They could also go to work in the industry and try to change it from within. I'm sure they would be almost as popular as union organizers.

    • The guy helped build Google. He might just have some valid insights about surveillance capitalism and biased algorithms based on his hindsight on what he helped build.

  • Want a new CS industry where tokens ALL get the jobs ?

    Then you should have taken CS101 instead of Wimmins Studies, shouldn't you ?

    Now give me my fries.

  • by JustWantedToSay ( 6338930 ) on Monday November 08, 2021 @08:15AM (#61967987)
    For Economics majors, do we embed ethics through the entire curriculum? How about Chemistry? Physics? Agriculture?

    Ethics should remain an elective for a reason--it's not a settled subject. Should we teach virtue ethics? Utilitarian ethics? Deontology? Antiracism? What is the Confucian perspective on P=NP? If we teach all of the major approaches in each class, will we ever get around to talking about compilers in a class called "Compilers 101"? I say this as someone whose undergraduate degree is in philosophy. I'd love to attend a seminar called "The libertarian utopianism of BitCoin" but that should be separate from a class on "Advanced Topics in BlockChain".

    Putting ethics central in a curriculum where the primary interest has little to do with ethics will present an opportunity to capture the moral attention of that world that will be too tempting of a target for those who have an ideological axe to grind. I just don't trust our increasingly polarized educational institutions to present the subject in a meaningful or useful way that won't cause more problems than it's attempting to solve. Besides, most of these concerns aren't issues of coding or architecture but of business models. In other words, the ethical problem (such that one exists) is with MBA programs. I suspect that one reason for targeting CS to carry the ethical torch is because it tends to lean liberal and so they would be more receptive to the types of ethics being pursued as opposed to MBA programs which tend to be more conservative.
  • Not wanting to disrespect Mehran Sahami, but the blurb seems to be self-referencing.

    Let people think about new concepts, strategic bargaining power, economics, finance, marketing and many other things by communicating well. The art of communicating concisely and tersely is one very few people master.

  • In capitalism, companies compete. Those who are less successful at competing are replaced by those that are more successful at competing.

    It's not about being "good" or "evil". Just competition.

    While tech companies follow this guideline as well, it's not limited to tech companies. "News" outlets may find that misleading stories result in more viewers/readers than accurate stories, and thus can become successful while spreading misinformation. Entire fields of alternative "medicine" spread misinformation

  • by ET3D ( 1169851 ) on Monday November 08, 2021 @08:47AM (#61968057)

    All industries face the problem that they're done for profit. Over the years everything from food to construction to drug development proved that money often comes before ethics.

    Blaming a "CS mindset" for it is silly. A lot of CEOs and board members of tech companies don't even have any CS background. They are just business people. Yet they are the ones who make the decisions about product directions. Managers often make decisions that go directly against what lower level technical people would want.

    Sure, government oversight is one way to go, and there are other options, but wrapping it up in a way which suggests that CS is the problem has little to do with reality and is pure propaganda, which I feel is trying to take the blame away from those who make the actual decisions.

  • Ethics has literally no place in capitalism*, which is the system these people will be working in. If it sells, then it will be built.

    Embedding ethics in a single curriculum is not enough, we need it to be embedded in society. We need to be more prepared to tell people "no", and to back that up with penalties and punishment for those that cause damage. But none of that will get past capitalist leaders.

    *I know Adam Smith wished upon a star that it did via some sort of emergent behaviour magic pixie dust, but

  • Maybe the CS mindset will result in too many TLAs being used, resulting in an information OD when presenting to a potential VC source.
  • In other words, what this guy wants is an authoritarian dictatorship, like China. No thanks.

    • by dfghjk ( 711126 )

      No, he just wants others to not do what he has done because he does not wish to suffer like the people he victimized. He's clearly not advocating for authoritarian government, he is advocating for total lack of government constraint in favor of personal responsibility, but only for you, not him.

      Google was ethical in the beginning...and smart in what they were doing. I'm sure he doesn't mind the personal profit from the downturn in their ethics.

  • The concerns are reasonable, but I think the problem is much bigger - it's a problem with society as a whole. Our entire society has been (deliberately) oriented around cementing the power of the political class and moving money upwards.

    Government regulation? Regulations are literally written by the companies that are being regulated. What you're really saying is "we need Google to write some legally binding rules that limit how much they can optimize their business". In the end what you get is regulati

  • If every engineer stopped to worry about future ethical concerns, they'd never build anything. It's a well-known fact that any creation by man can be used for both good and nefarious purposes. There is no avoiding that. You can't even try to teach ethics at an early age because the human mind will always a) be suspicious of others telling them what to do or how to behave and b) try to game the system (created by other people) to their own advantage.

  • ... system problems should not be framed as choices that can be made by individual consumers.

    This is precisely the point of "small government". With the consequence that poverty-stricken towns/states can't fix them and an echo chamber of bible-thumpers can force the town/state to obey their rules. Back when universities set school curriculum, they demanded civics, history and philosophy - the tools of leadership - be a requirement of entering university. Then university, like school became focused on marketable skills. Leadership, or lack of it, isn't unique to IT and workers writing software d
