Tyrone Glover CEO / President / Credit Coach / Substitute Teacher / Accidental Consumer Advocate 805-428-9424 Schedule a free Forensic Financial Data Analysis
I was an Entrepreneur Masquerading as an Employee #73
The more you learn, the more you realize how much you don't know, and how it's being used against you! Algorithms and business are not your friends.
I grabbed a piece of this article from the @Guardian. If you get a chance, read the entire article. Eye-opening and jaw-dropping!
Thanks to Leo Hickman, director/editor of @CarbonBrief.
We make a big stink when it's the government in our business! However, we give big business and industry all our data to use as they see fit.
Why do the rich get richer?
Why does the struggle continue for the masses?
Why are our financial products and services priced so insanely, e.g., interest rates or apartment rentals?
Why are employment opportunities so few for so many?
Have we given the keys to the henhouse to the fox?
YES WE HAVE!
One could say we are paranoid! The evil mind of the machine! More bogeyman talk!
No, I'm more annoyed at the fact that we spend so much time looking for solutions when a solution is looking for you.
Just a taste! Please go back and read the article in its entirety.
The idea that the world’s financial markets – and, hence, the wellbeing of our pensions, shareholdings, savings etc – are now largely determined by algorithmic vagaries is unsettling enough for some. But, as the NSA revelations exposed, the bigger questions surrounding algorithms centre on governance and privacy. How are they being used to access and interpret “our” data? And by whom?
Dr Ian Brown, the associate director of Oxford University’s Cyber Security Centre, says we all urgently need to consider the implications of allowing commercial interests and governments to use algorithms to analyze our habits: “Most of us assume that ‘big data’ is munificent. The laws in the US and UK say that much of this [the NSA revelations] is allowed, it’s just that most people don’t realize yet. But there is a big question about oversight. We now spend so much of our time online that we are creating huge data-mining opportunities.”
Algorithms can run the risk of linking some racial groups to particular crimes.
Brown says that algorithms are now programmed to look for “indirect, non-obvious” correlations in data. “For example, in the US, healthcare companies can now make assessments about a good or bad insurance risk based, in part, on the distance you commute to work,” he says. “They will identify the low-risk people and market their policies at them. Over time, this creates or exacerbates societal divides. Professor Oscar Gandy, at the University of Pennsylvania, has done research into ‘secondary racial discrimination’, whereby credit and health insurance, which relies greatly on postcodes, can discriminate against racial groups because they happen to live very close to other racial groups that score badly.”
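To make that mechanism concrete, here is a minimal toy sketch of the kind of scoring Brown describes. Every number, field name, and function here is invented for illustration, not taken from any real insurer: the point is only that a score can avoid mentioning race entirely and still split applicants along lines that track where people live.

```python
# Toy illustration (entirely hypothetical data and weights): how
# "indirect, non-obvious" features like commute distance and postcode
# claim rates can act as proxies for protected characteristics.

def risk_score(commute_miles, postcode_claim_rate):
    """Naive insurer-style score: longer commutes and high-claim
    postcodes raise the score. Neither input names race, yet both can
    correlate with it through housing and commuting patterns."""
    return 0.5 * commute_miles + 100 * postcode_claim_rate

# Two applicants with identical health, different neighborhoods.
applicant_a = risk_score(commute_miles=5, postcode_claim_rate=0.02)
applicant_b = risk_score(commute_miles=25, postcode_claim_rate=0.08)

print(applicant_a)  # 4.5
print(applicant_b)  # 20.5
```

Applicant B scores more than four times higher than applicant A without the model ever seeing a protected attribute, which is exactly the "secondary discrimination" effect Gandy's research describes.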
Brown harbours similar concerns over the use of algorithms to aid policing, as seen in Memphis where Crush’s algorithms have reportedly linked some racial groups to particular crimes: “If you have a group that is disproportionately stopped by the police, such tactics could just magnify the perception they have of being targeted.”
Viktor Mayer-Schönberger, professor of internet governance and regulation at the Oxford Internet Institute, also warns against humans seeing causation when an algorithm identifies a correlation in vast swaths of data. “This transformation presents an entirely new menace: penalties based on propensities,” he writes in his new book, Big Data: A Revolution That Will Transform How We Live, Work and Think, which is co-authored by Kenneth Cukier, the Economist’s data editor. “That is the possibility of using big-data predictions about people to judge and punish them even before they’ve acted. Doing this negates ideas of fairness, justice and free will. In addition to privacy and propensity, there is a third danger. We risk falling victim to a dictatorship of data, whereby we fetishise the information, the output of our analyses, and end up misusing it. Handled responsibly, big data is a useful tool of rational decision-making. Wielded unwisely, it can become an instrument of the powerful, who may turn it into a source of repression, either by simply frustrating customers and employees or, worse, by harming citizens.”
So what do we do, people?
Do we stop the bleeding, or do we just bleed out?
I personally believe we have lost enough blood!
Every last one of us in the struggle! The working poor and middle class!
Fighting for 15!
Raising the minimum wage!
Fight for Women's Rights!
The Trillion Dollar Students!
The Right to Unionize!
All Lives Matter!
Black Lives Matter
Education not Incarceration!
Marriage Equality, LGBT Rights!
Equal Under the Law!
Duty, Honor, Country!
The Inspiring Entrepreneur!
I'm sure you get my point. #hashtag
Politicians are missing the big picture because they are out of focus. Excluding the initials (BS), no, not the four-letter word, and, sad to say, (RP)!
I'm not crazy about (RP), but 1% of my brain is listening. However, the remaining 99% of my brain thinks you're full of that four-letter word, (RP)!
"I used to be" isn't relevant to those who are!
"I once was" doesn't apply to "I am"!
"We can" doesn't resonate, because "I am, you're not"!
Many of you in the spotlight were once in the struggle, watching your parents and grandparents make ends meet with many sacrifices.
Now the sacrifices are so great, millions will only see an end, forget about having them meet.
I love this country; that's why I served!
You are all my brothers and my sisters! I shall keep you safe! For God doesn’t give us what we can handle; God helps us handle what we are given.
Stay safe in your travels!