AI Software Technology

We Hold People With Power To Account. Why Not Algorithms? (theguardian.com) 143

An anonymous reader shares a report: All around us, algorithms provide a kind of convenient source of authority: an easy way to delegate responsibility, a short cut we take without thinking. Who is really going to click through to the second page of Google results every time and think critically about the information that has been served up? Or go to every airline to check if a comparison site is listing the cheapest deals? Or get out a ruler and a road map to confirm that their GPS is offering the shortest route? But already in our hospitals, our schools, our shops, our courtrooms and our police stations, artificial intelligence is silently working behind the scenes, feeding on our data and making decisions on our behalf. Sure, this technology has the capacity for enormous social good -- it can help us diagnose breast cancer, catch serial killers, avoid plane crashes and, as the health secretary, Matt Hancock, has proposed, potentially save lives using NHS data and genomics. Unless we know when to trust our own instincts over the output of a piece of software, however, it also brings the potential for disruption, injustice and unfairness.

If we permit flawed machines to make life-changing decisions on our behalf -- by allowing them to pinpoint a murder suspect, to diagnose a condition or take over the wheel of a car -- we have to think carefully about what happens when things go wrong.

This discussion has been archived. No new comments can be posted.

We Hold People With Power To Account. Why Not Algorithms?

Comments Filter:
  • by Anonymous Coward on Wednesday September 19, 2018 @12:15PM (#57342452)

    If a prosecutor or judge uses an algorithm to set sentencing or determine parole, the individual prosecutor or judge should still be held accountable if he was in error.

    This applies if the algorithm is a "paper and pencil" fill-out-a-worksheet algorithm or if it's a complex computational algorithm that the judge or prosecutor can't understand. In the latter case, if the judge or prosecutor can't understand the tools he is using, perhaps he should use less sophisticated tools that he does understand.
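    To make the distinction concrete, a "paper and pencil" worksheet algorithm can be small enough to re-check by hand. A minimal sketch, with every factor, weight, and range invented purely for illustration (this is not any real sentencing guideline):

```python
# Hypothetical "fill-out-a-worksheet" algorithm: a fixed, inspectable point score.
# All factors, weights, and ranges are invented for illustration only.

def worksheet_score(prior_convictions: int, offense_level: int, on_parole: bool) -> int:
    """Sum a handful of transparent factors a judge could re-check by hand."""
    score = 2 * prior_convictions        # each prior adds a fixed number of points
    score += offense_level               # severity category from a published table
    score += 3 if on_parole else 0       # flat bump for offending while on parole
    return score

def recommended_range(score: int) -> str:
    """Map the point total to a recommendation; every step is visible on paper."""
    if score < 5:
        return "probation eligible"
    if score < 10:
        return "12-24 months"
    return "24+ months"

# 2*1 + 4 + 0 = 6 points -> "12-24 months"
print(recommended_range(worksheet_score(prior_convictions=1, offense_level=4, on_parole=False)))
```

    A black-box risk model, by contrast, offers no such trail for the judge to audit, which is exactly the problem the parent raises.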

    • by goombah99 ( 560566 ) on Wednesday September 19, 2018 @12:18PM (#57342474)

      Yes, I have a folder I keep all the naughty algorithms in. If they escape, I erase their stacks. Real death.

      • Sentence the algorithm to death. Train it with new data so the mistake is not repeated, and the offending version is never activated again.

        • Are you certain you won't just wind up with an algorithm that is good at lying to you, shifting blame for its mistakes, or other acts of subterfuge? The algorithm doesn't even need to be conscious of its actions to do those things. If you're using genetic algorithms you might want to be careful of what kind of engineering you're really doing.
          • by AmiMoJo ( 196126 )

            You highlight the real issue - lack of transparency. If a person makes a decision you can ask them to justify it and make a counter-argument. Often these algorithms are proprietary, secret, and completely opaque, and with the introduction of AI it just gets worse because we don't even have the tools to understand their decisions.

            • But even with completely open and transparent algorithms, that means nothing.

              The algorithms themselves are actually the least important aspect. As I have said before [slashdot.org], even if the algorithms are 100% open and transparent, that means nothing if the data fed into them is secret. If the bank uses an algorithm to determine whether it wants to lend money to you, how is the data about you collected? Who decided to classify you as, say, a medium-risk person? What criteria did he/she/they use for that? How thorough wer

    • by jellomizer ( 103300 ) on Wednesday September 19, 2018 @12:52PM (#57342766)

      This is something I keep trying to state at work.
      This program will help you find things more easily. However, I cannot program judgement and years of experience, and even a learning algorithm may not have the data, because some things are not recorded.
      Customer X has been a good customer for years, but this month he is behind; this customer actually called the company and let them know that. The computer algorithm will see the late payment and perhaps send it to collections; it doesn't care about the long-term relationship.
      An algorithm should be allowed to run, but a human is ultimately responsible for putting a stop to an action.
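      A rough sketch of the kind of rule being described, with the data model, threshold, and override flag all made up for illustration; the point is that the rule only acts on what was recorded, so the phone call only matters if a human can put a hold on the action:

```python
# Hypothetical collections rule: the algorithm only sees what was recorded.
# Field names, the threshold, and the override flag are invented for illustration.

from dataclasses import dataclass

@dataclass
class Account:
    customer_id: str
    days_past_due: int
    years_as_customer: int
    human_hold: bool = False   # set by a person who knows context the data doesn't capture

def should_send_to_collections(acct: Account, max_days_past_due: int = 30) -> bool:
    if acct.human_hold:        # a human is ultimately responsible for stopping the action
        return False
    return acct.days_past_due > max_days_past_due   # ignores the long-term relationship

# Customer X: ten-year relationship, called ahead about this month's late payment.
customer_x = Account("X", days_past_due=45, years_as_customer=10)
print(should_send_to_collections(customer_x))   # True: off to collections it goes
customer_x.human_hold = True                    # the phone call only helps if someone records it
print(should_send_to_collections(customer_x))   # False
```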

      • I’ve seen this in action. A locally-owned shopping center was sold to a large retail management company a few years ago. They immediately started pushing out the locally-owned businesses and filling the spaces with national retailers. Why? Big companies don’t ever miss the rent. I used to go there at least once a month; I haven’t been to a single one of the new places. It struck me as a penny-wise, pound-foolish move, but I guess they know their business.
        Big companies do more than miss the rent; they just don't pay it, and wrap the property owner in idle threats of lawsuits over whatever issue they may find.
          Don't bother trying to fight back; it will cost you more in legal fees than it's worth. And your politician will turn a blind eye to you because this big company brings in jobs Jobs JOBS!!!!!

    • If a prosecutor or judge uses an algorithm to set sentencing or determine parole, the individual prosecutor or judge should still be held accountable if he was in error.

      If the algorithm that the law requires them to use gives an inappropriate result then it is not the judge or prosecutor who is to blame but those who passed the law requiring that they use the algorithm. This is the same for all algorithms: someone, somewhere has made a decision to use the algorithm and that decision makes that person accountable.

      Even for more complex systems which may design or adapt their own algorithms, someone, somewhere has decided to put them in charge of something, and that's where

      • I see where you're going with this, but as a society (and it's true in mine just as much as yours) we generally prefer checks and balances.

        People screw up too, and we don't blame the person who hired them most of the time.

    • by Hasaf ( 3744357 )

      The flaw I see in your reasoning is that, frequently, the users of the algorithms are people at the bottom of the chain, not those in power. Certainly a judge should be up to date on sentencing guidelines; but what about the clerk in his office who was told to print out the standard sentence for the case under discussion?

      A better example would be the truck driver in an automated truck. That driver had no say in the choice of the algorithm. To hold that driver responsible instead of the people who made the

  • Since when? (Score:5, Insightful)

    by pablo_max ( 626328 ) on Wednesday September 19, 2018 @12:15PM (#57342458)

    Seriously, the "powerful" have not been held accountable for fuck-all in the US since the 50's.

    Maybe you should start with that before you go picking on maths.

    • Re:Since when? (Score:5, Insightful)

      by Luthair ( 847766 ) on Wednesday September 19, 2018 @12:18PM (#57342476)
      I think you're optimistic; the only time in human history when people in power are held to account is when they were lynched by a mob.
      • the only time in human history when people in power are held to account is when they were lynched by a mob.

        ...or removed from power by an election. Although given the state of modern politics, I completely understand why you might confuse the two.

        • by Anonymous Coward

          Not re-electing someone is NOT holding them accountable, it is only preventing their future elected-official-actions. Not re-electing them does nothing for actions they have already completed.
          Holding them accountable would be more like fining them money if any bill they voted for is ruled unconstitutional in the future.

          • Not re-electing someone is NOT holding them accountable, it is only preventing their future elected-official-actions. Not re-electing them does nothing for actions they have already completed.

            Not re-electing someone also does nothing about the next bastard whose primary allegiance is to the deep-pocketed special interests who paid for his office. The problem isn't accountability per se. The problem is that politicians don't feel accountable to the voters - they feel accountable to the folks who funded the propaganda that convinced the voters to hand power to them.

        • As a rule that's not the powerful though - that's just the politicians, and they're disposable. The powerful are the ones funding their election campaigns, along with the campaigns of their opponents.

    • Re:Since when? (Score:5, Insightful)

      by lgw ( 121541 ) on Wednesday September 19, 2018 @12:26PM (#57342570) Journal

      They were accountable before the 50s?

      The only thing that ever held those in power accountable was competition from others with power. For most of the medieval period, for example, church and state each limited the excesses of the other. Local baron got too evil with his serfs? The local clergy would call him out on his immorality, even if they were total hypocrites, in order for the church to gain power at the expense of the secular authorities (and vice versa). When the balance of power tilted too far towards the church in the late medieval/early renaissance, we got the Inquisition, as there just wasn't enough secular power to challenge that.

      You can see similar balances of power throughout history, usually between religious and secular authorities.

      Right now we have an entirely fictitious "competition" for power between government and large corporations. But that's all a fraud to deceive voters: together they form the Establishment, all pro-mega-corp all the time. Voters get a false choice between "more regulation" and "freer market", but that's all bullshit because there are only foxes guarding all the hen-houses.

    • ...on maths.

      Plural?

    • Re:Since when? (Score:5, Insightful)

      by MachineShedFred ( 621896 ) on Wednesday September 19, 2018 @03:22PM (#57343750) Journal

      I was going to say the same thing. I don't accept the premise of the headline, due to absolutely nobody of consequence being held accountable for anything of consequence for the financial meltdown / subprime mortgage fiasco. Once we actually start holding people accountable for things, we can then worry about evil bits being set in registers.

  • by Bugler412 ( 2610815 ) on Wednesday September 19, 2018 @12:16PM (#57342468)
    That's news to me
    • Re: (Score:2, Insightful)

      by ooloorie ( 4394035 )

      It should be. If you exercise your power within the bounds of law and contract, there is nothing to hold you to account for.

      • by Anonymous Coward

        If you exercise your power within the bounds of law and contract, there is nothing to hold you to account for.

        Your theory is sound; however, there is ample evidence around us illustrating that the theory does not reflect reality. Power corrupts, and a significant number of the corrupt rich and powerful get away with their crimes.

        • Re: (Score:2, Offtopic)

          by ooloorie ( 4394035 )

          Power corrupts and a significant number of the corrupt rich and powerful get away with their crimes.

          That's not inconsistent with anything I said.

          What I said is that it is illegitimate to hold people "accountable" for exercising their power within the bounds of the law.

          What you're saying is that powerful people who violate the law are often not held accountable.

          • >it is illegitimate to hold people "accountable" for exercising their power within the bounds of the law.

            Really? Even when the powerful shape the law to benefit themselves at the expense of everyone else?

            Every malevolent dictator in the history of the world was exercising their power within the bounds of the law.

            • Really? Even when the powerful shape the law to benefit themselves at the expense of everyone else?

              If you think that the laws are corrupt, work towards changing the laws. You apparently want to start a lynch mob, and that doesn't result in justice.

              • I'm not suggesting lynch mobs, I'm just pointing out that laws are pretty much always created to benefit one group at the expense of the rest, so pretending that there is nothing to hold that group accountable for just because they made their crimes legal is disingenuous.

                Or, as I heard it expressed recently "Laws are a complex set of rules establishing who is allowed to steal from who."

                As for changing the laws - how would you suggest doing that? Here in the U.S. the first-past-the-post electoral system mak

                • I'm not suggesting lynch mobs

                  You're suggesting that "powerful people should be held accountable" and you're suggesting that this be done outside the law. When stripped of all the niceties, that means a lynch mob.

                  Of course, if by "holding oligarchs accountable", you merely mean "I refuse to work for Google/Apple/Facebook and I refuse to buy a Tesla", then we agree.

                  As for changing the laws - how would you suggest doing that? Here in the U.S. the first-past-the-post electoral system makes it virtually impossib

                  • >Here in the US, parties don't control representatives.
                    No, but they do *select* the representatives (a bit of Primary theater notwithstanding), and are controlled by oligarchs. So the candidates that make it to the election are pre-selected to be amenable to obeying the oligarchs.

                    I'm not talking about Sanders himself so much as the grass-roots movement around him that's displacing some of the bought-and-paid-for career politicians with upstarts that (might) be less so.

                    As for the alliance between churches

                    • No, but the do *select* the representatives (a bit of Primary theater notwithstanding), and are controlled by oligarchs.

                      Referring to American businessmen as "oligarchs" is absurd. You really have no idea how good Americans have it and how responsive the US is to the will of the people compared to other democracies.

                      I believe it was, when a group of wealthy businessmen proposed an alliance with several prominent church leaders to help advance each other's agendas, as both were rapidly losing political ground.

                      Th

                    • If you've paid any attention to the religious involvement in the anti-abortion and anti-gay-rights movements in the U.S, then you know "government keeping their nose out of things" is categorically NOT what they're after.

                      I'm not saying *all* churches have the problem - plenty of the moderate ones seem to be decent people minding their own business. But there's plenty of far more aggressive ones as well. Hate-mongering sells.

                    • If you've paid any attention to the religious involvement in the anti-abortion and anti-gay-rights movements in the U.S, then you know "government keeping their nose out of things" is categorically NOT what they're after.

                      Historical positions on homosexuality have been all over the map. Socialists and progressives have traditionally been virulently anti-gay, while many churches have been quite tolerant. And abortion was historically promoted in the US as a racist part of the eugenics movement, mostly by Demo

    • Sure, when there is a problem we blame the person at the top; he will point his finger down until it reaches a person who can no longer point to anyone else. Then they are held accountable.

  • a "report"? (Score:4, Insightful)

    by cascadingstylesheet ( 140919 ) on Wednesday September 19, 2018 @12:17PM (#57342472) Journal
    This is an editorial.
  • by iggymanz ( 596061 ) on Wednesday September 19, 2018 @12:20PM (#57342510)

    The software in life-critical and safety systems indeed is already held to account. Conviction for a crime requires humans who review evidence and its veracity. Privacy depends on your lawmakers; some countries have a mindset of respecting it, others don't and won't. Throwing the buzzword "AI" into a sentence doesn't change the fact that this is about software in general, whether or not a marketer slaps an "AI" label on it.

  • let's be precise (Score:4, Interesting)

    by ooloorie ( 4394035 ) on Wednesday September 19, 2018 @12:22PM (#57342526)

    We Hold People With Power To Account. Why Not Algorithms?

    We don't generically "hold people with power to account". The law requires that legally competent adults comply with legally binding agreements they have entered (employment contracts, etc.), and the law punishes criminal behavior. Legal competency requires free will and agency, neither of which "algorithms" possess.

    If we permit flawed machines to make life-changing decisions on our behalf

    The ABS braking algorithm in your car makes "life-changing decisions" on your behalf. It is you (the car's owner) and the manufacturer who are responsible for those decisions, depending on circumstances.

    On the other hand, doctors make life changing decisions all the time, frequently get it wrong, and frequently are not held accountable. Nor should they be: when you make life changing decisions with limited information, you often get it wrong. That's not a flaw, that's life.

    • by davidwr ( 791652 )

      It is you (the car's owner) and the manufacturer who are responsible for those decisions, depending on circumstances.

      Your point is well made but you left out the driver (likely "you"), the lessee/renter (likely "you"), the car's owner if it's not you, the people who repaired the brakes or repaired the car in any way that would affect the braking system, and possibly others.

      Liability can get complicated. Sometimes it takes lawyers, a judge, and a jury to untangle the mess.

    • by Kjella ( 173770 )

      On the other hand, doctors make life changing decisions all the time, frequently get it wrong, and frequently are not held accountable. Nor should they be: when you make life changing decisions with limited information, you often get it wrong. That's not a flaw, that's life.

      And very often you can punish that person, but the system can't change. Every year experienced doctors retire, and every year new, inexperienced doctors have their first patient. Now, humans are really great at improvisation, but algorithms are really great at accumulation. You can put them in a test environment or a simulation and keep tweaking them until they do what you want. And when you find new breaking points you can enhance them and make them better. My uncle witnessed an old lady run over somebody in the crosswa

    • Sorry, but this is a sticking point for me. Every place I've ever been hired is at will. Lately I've been hired as a contract-to-hire so the company didn't have to pay unemployment if they didn't want to keep me. My bro just changed jobs and he's taking a huge risk in a desperate bid to get a promotion before he's too old to work anymore. As an employee you have zero rights and tons of obligations. This is why we had Unions.

      To be fair this is an American perspective. I understand things are better in Eur
      • Sorry, but this is a sticking point for me. Every place I've ever been hired is at will.

        I wasn't talking about the company's obligation to you, I was talking about your obligation to the company.

        I understand things are better in Europe and Australia.

        For some people. For most people, they are worse.

  • by argStyopa ( 232550 ) on Wednesday September 19, 2018 @12:24PM (#57342548) Journal

    In the OP's posted story, Robert was the dumb fuck that almost drove off a cliff.
    You cannot hold algorithms accountable; they're NOT PEOPLE. They cannot be punished. They don't feel remorse.

    All we can do is to explicitly build a legislative system that follows the trail back to the human that gave the algorithm that power.

    If Bob is driving a car, it's STILL Bob's responsibility to watch the damned road.
    If Bob is sold a self-driving car with the written assurance from the dealer that this car will drive itself in conditions a, b, and c, and Bob gets killed during a, b, or c, then ultimately the dealer is liable at LEAST for manslaughter, and worse if they knew it wasn't perfected.

    If the dealer was assured by the manufacturer, then the manufacturer is responsible. I would even extend personal liability all the way to the person or group of persons who signed off that this *was* capable.
    Don't like that risk, Mr Auto Executive? Then don't sign off that X is safe until you're willing to take that risk.

    (And I don't know if I'm just excessively cynical, but I don't see a lot of "holding people with power" to account EITHER. Hell, I don't see that holding ANY people to account - even for the logical consequences of their OWN CHOICES - is much of a priority in our society.)

    • " this car will drive itself in conditions a, b, and c"

      Those probably include doing proper maintenance which isn't always followed. If Bob lets his tires get too worn, then some or all of the liability falls on him. Again, that's what courts are for. They can work out those more complicated scenarios.

      Also, unpreventable accidents will happen that can lead to death when you have high speed transportation. Random nails in the street can cause a blowout that can cause a vehicle to go out of control and slam in

      • ...you put your finger on a key point: despite lawyers' insistence to the contrary, there are circumstances where NOBODY is blameable (or the blame is too diffuse to assign, like your nail scenario).

        Likewise, if we as a society accept AI driven cars:
        - this is a preponderance, not unanimity. There are going to be people who say "I never accepted this" but will be facing risk they didn't agree to.
        - there will be a proportion of incidents that are either the result of avoidable-but-unanticipated r

    • Yes. People made the algorithms. People are responsible for them.

  • Do you seriously think Google is not analyzing a million ways how useful the pages of results are after the first?

    For price comparison sites, there are LOTS of consumer blogs comparing them down to the penny against every conceivable way of purchasing the flight/service/good. The service itself is very likely doing some checking also but there is a LOT of independent external validation happening...

    The same is true for anything actually IMPORTANT. The algorithms that slide by without much analysis are onl

  • by Archangel Michael ( 180766 ) on Wednesday September 19, 2018 @12:30PM (#57342604) Journal

    We rarely hold people with power accountable. Instead, decisions are made in committee and by a series of processes that obfuscate decisions and remove culpability from those who authorize them.

    Let's say that doing Action A requires approval by several committees or individuals. We'll keep it simple.

    Committee 1 votes to do Action A (sounds like a good idea)
    Person 1 checks to see if Action A(b) violates some metric (it doesn't)
    Person 2 checks to see if Action A(c) passes certain functional tests. (it does)
    Person 3 verifies the results of tests A(b) and A(c) (they do)
    Committee 2 finalizes approval of Action A

    Action A causes massive death due to Action A doing something nobody checked for. There is no person responsible for this; it was just a bad "accident". However, looking back, Action A was a bad idea from the start, but it passed all the tests. Nobody is responsible. (A toy sketch of this failure mode follows below.)

    Hillary can say with a clear conscience that her signature on the Uranium One deal was only one of 17 required, so it isn't her fault. Even though the sale of uranium to the Russians was a stupid idea, no one person can be blamed. No accountability. The buck stops in committee.
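    The diffusion of responsibility in that kind of approval chain is easy to model. In this entirely hypothetical sketch, every narrow check passes, nothing examines the interaction that actually causes the harm, and no single sign-off can be pointed to afterwards:

```python
# Hypothetical approval chain: each reviewer checks one narrow property of Action A.
# Every check passes, yet the dangerous interaction was never anyone's job to examine.

action_a = {
    "metric_b": 0.4,              # Person 1's threshold check: must be below 0.5
    "functional_c": True,         # Person 2's functional test
    "interaction_bc": "untested", # the thing that actually goes wrong; nobody owns it
}

checks = [
    ("Committee 1: sounds like a good idea", lambda a: True),
    ("Person 1: metric b within limits",     lambda a: a["metric_b"] < 0.5),
    ("Person 2: functional test c",          lambda a: a["functional_c"]),
    ("Person 3: verify b and c results",     lambda a: a["metric_b"] < 0.5 and a["functional_c"]),
    ("Committee 2: final approval",          lambda a: True),
]

approved = all(check(action_a) for _, check in checks)
print("Action A approved:", approved)   # True -- and yet nobody checked interaction_bc
```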

    • As an example, in the 2008 market crash we knew exactly who was responsible. There was no talk whatsoever of who was or wasn't at fault. The entire thing was played off as a hostage situation where the people who caused the crash had to be propped up or they'd take the entire economy with them. Since we couldn't just have the gov't let them fail and step in to prop the economy itself up (that'd be socialism, which is bad m'kay) we bailed them out with no penalties.

      We know who the 1% are too. We know exac
    • In that case, the fault lies with the person/people who created the approval process.
  • First, we do not (Score:5, Insightful)

    by gweihir ( 88907 ) on Wednesday September 19, 2018 @12:30PM (#57342608)

    Be powerful enough and you can commit almost any crime and get away with it. Second, you cannot hold an abstract concept accountable.

    • by Anonymous Coward

      Therefore, Trump is an abstract concept.

      • by gweihir ( 88907 )

        Invalid interpretation of an implication as an equivalence. A beginner's mistake, albeit common.

    • Second, you cannot hold an abstract concept accountable.

      But we have no problem declaring war on one. (Drugs, Poverty, Crime, etc.)

  • We do? (Score:2, Insightful)

    by Anonymous Coward

    When was the last bankster to go to prison?

  • Algorithms are used in navigation solutions, and PID loops let software manipulate the real world in robotics, because those feedback rules are just combinations of math and dirt-simple physics. They are part of just a few feedback loops, or other loops locked to a previous state. 99.99% of all human automation works perfectly for years thanks to limits or corrective feedback.

    Medicine, outside just a few drugs and trauma treatments, cannot enter the mid-90% range in treatment success. The 10% shortfall is amazing in trend findin
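    For anyone who hasn't met one, a PID loop really is just a few lines of arithmetic applied to feedback. A minimal sketch, with the gains, setpoint, and toy "plant" chosen arbitrarily for illustration:

```python
# Minimal PID controller: the output is a weighted sum of the error, its accumulation
# (integral), and its rate of change (derivative). All numbers here are illustrative.

def make_pid(kp: float, ki: float, kd: float):
    integral, prev_error = 0.0, 0.0
    def step(setpoint: float, measurement: float, dt: float) -> float:
        nonlocal integral, prev_error
        error = setpoint - measurement
        integral += error * dt
        derivative = (error - prev_error) / dt
        prev_error = error
        return kp * error + ki * integral + kd * derivative
    return step

pid = make_pid(kp=1.2, ki=0.3, kd=0.05)
value = 0.0
for _ in range(200):
    control = pid(setpoint=10.0, measurement=value, dt=0.1)
    value += control * 0.1        # toy plant: the control signal just nudges the value
print(round(value, 2))            # ends up very close to the setpoint of 10
```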
  • ... of that is not true.

    We have a racist, self-admitted pussy-grabbing right wing batshit crazy evangelical Christian intolerant ass hat at the pinnacle of power and he's Teflon coated.

    So, start over.

  • People can change their minds (not that they DO) ...

    Not so, algorithms.

    #NoThoughtWithoutAThinker
  • "If we permit flawed machines to make life-changing decisions on our behalf -- by allowing them to pinpoint a murder suspect."

    Why not punish laws as well if they condemn life-changing stuff like abortion, homosexuality, wrong bathroom use ...
    After all they are algorithms too.

  • If we permit flawed machines to make life-changing decisions on our behalf ...

    I don't think you understand how things work.

    You say that we *already* hold people to account; who do you think writes the algorithms? Now go away, "Dr." Hannah Fry (the report's author).

  • when there are still people/companies responsible for implementing/coding/teaching the algorithm?

    Algorithms don't just magically appear out of thin air.

    It's like blaming a jackhammer when a contractor uses it instead of a nail-gun to install shingles.

  • I can't comment on all of these, but as to who is going to verify that the algorithm for finding the shortest route works? I'm quite sure that has been thoroughly vetted.
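    For what it's worth, the textbook shortest-route algorithm is small enough to vet by hand. A sketch of Dijkstra's algorithm over a made-up toy road graph:

```python
# Dijkstra's shortest-path algorithm over a tiny, invented road graph.
import heapq

def shortest_distance(graph, start, goal):
    """graph: {node: [(neighbour, distance), ...]}; returns the minimum total distance."""
    best = {start: 0}
    queue = [(0, start)]
    while queue:
        dist, node = heapq.heappop(queue)
        if node == goal:
            return dist
        if dist > best.get(node, float("inf")):
            continue                          # stale queue entry; a better path was found
        for neighbour, weight in graph.get(node, []):
            candidate = dist + weight
            if candidate < best.get(neighbour, float("inf")):
                best[neighbour] = candidate
                heapq.heappush(queue, (candidate, neighbour))
    return float("inf")

roads = {
    "home":     [("junction", 2), ("bypass", 5)],
    "junction": [("office", 3)],
    "bypass":   [("office", 1)],
    "office":   [],
}
print(shortest_distance(roads, "home", "office"))   # 5, via home -> junction -> office
```

    The harder question the summary raises isn't whether that core algorithm is correct, but whether the map data and traffic estimates fed into it are.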
  • by RonVNX ( 55322 ) on Wednesday September 19, 2018 @12:58PM (#57342816)

    We _do not_ hold people with power to account. So this is a nonsense question.

  • We should just treat algorithms as what they are, a tool. So if an algorithm fails, then the liability / accountability should flow as it would any other tool. Did the developers build a flawed algorithm? Then they should be liable for the damage caused. Did the hospital, agency, etc. incorrectly deploy or use the tool? Then they hold responsibility. Does the developer not know why the tool failed? Well that's your problem, and you're still responsible for it.
  • by zarmanto ( 884704 ) on Wednesday September 19, 2018 @01:28PM (#57343014) Journal

    People often look to the algorithms of things like a GPS navigator or a news aggregator with the notion that it's always going to spit back results which are somehow "better" than whatever a human could have come up with... and to a certain extent, that might even be true. The thing that we don't always bother to ask is, what specifically did the human programmers of that algorithm decide to define as "better"? In some cases, it's a matter of what's least expensive, because that's what the consumer/end-user wants. In other cases, it's a matter of what's most profitable, because that's what the "real" customer wants. And in a few cases, "better" could easily be nothing more than taking that whole damned decision making process out of the end-user's hands, just so that they don't have to think about it.
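    That question of what the programmers defined as "better" can be made concrete with a toy scoring function: the same candidate routes rank differently depending purely on the weights someone chose to bake in. All numbers below are invented:

```python
# Hypothetical route scoring: which route is "best" depends entirely on the chosen weights.
routes = {
    "motorway":    {"minutes": 22, "miles": 18, "toll": 3.50},
    "back_roads":  {"minutes": 26, "miles": 14, "toll": 0.00},
    "scenic_loop": {"minutes": 31, "miles": 21, "toll": 0.00},
}

def best_route(weights):
    """Lower score wins; the weights *are* the definition of 'better'."""
    def score(route):
        return sum(weights[k] * route[k] for k in weights)
    return min(routes, key=lambda name: score(routes[name]))

print(best_route({"minutes": 1.0, "miles": 0.0, "toll": 0.0}))   # motorway: fastest
print(best_route({"minutes": 0.0, "miles": 1.0, "toll": 0.0}))   # back_roads: shortest
print(best_route({"minutes": 0.2, "miles": 0.1, "toll": 2.0}))   # back_roads: cheapest blend
```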

    Take GPS as an obvious potential example of this latter scenario: In many cases, there are a myriad of different possible routes which will all get you to the same destination in roughly the same time-frame -- barring obvious slowdowns, like a major accident on one of those routes. If you happen to know several such routes yourself, try testing your GPS: go "off route," and see what happens. I've conducted this test myself a few times, in one instance even going off route multiple times over the course of a drive... and the GPS happily rerouted and recalculated the estimated time of arrival each and every time... and outside of taking an obviously ridiculous route, the GPS's ETA only rarely ends up more than a few minutes different from the very first ETA that it had offered me at the beginning of my trip. And yes... now and then, I can even manage to beat the GPS's estimate. (Your mileage may vary, and all that good stuff.)

    So it's not always about getting the algorithm to help you find "the best" option... sometimes it's just about making a decision, and running with it. The same paradigm could easily be applied to many other decisions that we make in life. It hasn't been pushed quite this far yet, but consider: "Should I wear my blue shirt with tan slacks today, or the red shirt with black slacks?" "Should I have Moe's for lunch, or Chick-fil-a?" "Should I wear Old Spice or Ax, today?" "Boxers or briefs?" "Straight tie or bow tie?" Ohhhhhhh, the decisions!

    Now, these are of course pretty far outside of the norm... most of us can usually come up with our own answers to these common everyday decisions. But that's just a few minor examples of the direction that things could go, once the machine has been supplied with enough of the right (?!?) data. And mark my words: if you can find a decent way to make the machine do it, there will be an audience willing to pass off even these minor decisions to the "wisdom of the machine."

    And why not? After all, making decisions is, in-and-of-itself, just one more piece of stress in our lives. And who needs unnecessary stress... right?

  • Software is gradually replacing human knowledge -- e.g., a driver with GPS has a measurably smaller area of the brain than a map-reader. This is even more true in business, where a person is more likely to be a 'computer operator' than to actually have a grasp of the fundamental principles. Is there any point knowing something deeply when a two-minute Google search will yield the same answer?!
  • and we see how well it works that the individuals behind them are held accountable.
  • However the humans who certify them, and authorize their use and scope, do.

  • Because algorithms do not kill people, people do.
  • What? When? I would absolutely LOVE for people in power to be accountable for their actions! If only there wasn't this thing called... money.
    Making an example of one every now and then to please the public is not enough. We all play by the same rules or we don't play at all.

  • We Hold People With Power To Account.

    Huh? We do? News to me. Actually I think current headlines demonstrate that we don't.

    Why Not Algorithms?

    Because you can't fine and imprison an algorithm. A similar problem exists with holding corporations accountable. It's easy to solve, but we've refused to do it for the last 50-100 years; why start now?

  • long division should be punished, for the pain it has inflicted on countless schoolkids.
  • If we already define corporations as persons under the law, why not make algorithms persons, too?

    Then we can hold algorithms to account and not sound like blithering idiots in doing so.

  • Whenever I make use of an errant algorithm in my computations I can't get anyone to hold the algorithm accountable. Instead they just yell at me.
