By 3 May 2019 | Categories: Misc

This glimpse of the future was crafted by Matt Aberline and Dave Mackenzie, Aurecon.

Big data is a big mystery these days. On one end, technology and its spectacular advancements have been touted as humanity’s royal flush for the 21st century – a trick hand to leapfrog our own extinction and win the future. On the other end, the idea of an ever-expanding, data-driven matrix can be kind of scary – okay, very scary – and may even conjure a rather gloomy mood, with greyscale apocalyptic imagery on replay.

The truth is – it’s probably somewhere in the middle. Maybe there is an ‘eau de robopocalypse’ in the air. Nevertheless, there’s immense opportunity at hand, provided we continue to harness and wrangle it for the betterment of society.

Data is increasingly driving business decisions and pioneering new cryptographic value, fundamentally altering the way we work. Machine learning and data science are fast becoming the company MO for steering better value-add strategies. Blockchain technology and its golden child, bitcoin, are unlocking a new epoch of digital finance.

Is all this to say, big data is the big answer? With AI’s unrelenting tsunami of capability fast approaching, where does this leave the human in the mix?

Bugged with bias

You’ve heard it said before that “the data doesn’t lie”. But organisations like AlgorithmWatch and The Algorithmic Justice League have made it their life’s work to prove otherwise. By feeding our algorithms historical data that was largely shaped by innate cultural bias or discrimination, we are in essence perpetuating that same bias.

We unwittingly lug with us a set of accepted norms around determinants like gender, race, age and level of education, and then feed them into our data science to inform our outcomes. Not surprisingly, the insights and recommendations that come out perpetuate the same parochial patterns of biased behaviour.

Take, for example, the case of the US Justice Department, which implemented software to help determine the severity of a criminal’s sentence by predicting that person’s risk of re-offending. Although the formula claimed to exclude race or ethnicity as variables, black people clearly came out on the losing end – wrongly labelled as future criminals at almost twice the rate of white defendants, while only 20 percent of those predicted to commit violent crimes actually went on to do so. Facial recognition software likewise continues to fall short as a failproof form of identification, with accuracy skewed heavily towards white men while black women are misidentified at far higher rates.
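How a model can discriminate even when race is excluded comes down to proxy features. The toy sketch below is entirely hypothetical – the groups, numbers and the “priors” proxy are invented for illustration, not drawn from the actual sentencing software – but it shows the mechanism: two groups re-offend at identical rates, yet because one group’s recorded priors are inflated by historically heavier policing, a “blind” risk score flags its innocent members far more often.

```python
import random

random.seed(0)

# Hypothetical illustration: a risk model that never sees the protected
# attribute can still produce skewed false-positive rates when it leans on
# a proxy feature -- here, recorded priors inflated for one group.
def make_person(group):
    # Both groups re-offend at exactly the same underlying rate...
    reoffends = random.random() < 0.3
    # ...but group B carries extra recorded priors for the same behaviour.
    priors = (2 if reoffends else 1) + (random.randint(0, 3) if group == "B" else 0)
    return {"group": group, "priors": priors, "reoffends": reoffends}

people = [make_person(g) for g in ("A", "B") for _ in range(5000)]

# A naive "race-blind" risk score: flag anyone with more than one prior.
def high_risk(person):
    return person["priors"] > 1

false_positive_rate = {}
for g in ("A", "B"):
    innocent = [p for p in people if p["group"] == g and not p["reoffends"]]
    false_positive_rate[g] = sum(high_risk(p) for p in innocent) / len(innocent)

print(false_positive_rate)  # group B's innocents are wrongly flagged far more often
```

The protected attribute never enters the scoring function, yet the disparity appears anyway – which is why simply deleting a sensitive column is no guarantee of fairness.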

From court sentencing, recruitment and targeted marketing to digital humans and customer service, these kinds of data-driven models are increasingly shaping the world we live in. So, best we keep innovating and intervening on all fronts, so that our own tools don’t dig us in deeper.

Ethics will only become more important

The ethics of data collection will only become more paramount as the value of data continues to increase. In the future, organisations will have Chief Ethics Officers alongside their Chief Financial and Chief People Officers, such will be the demand.

Human intervention will remain critical. The reality is that no software or clever app is going to self-refine its comprehension systems or auto-correct inherent human bias. It will take people to do that.

Every AI system is the product of countless hours shared among teams of developers who organise extensive datasets and write the algorithms to bring their technology to life. More than ever, developers need to iterate, test and act as gatekeepers of the data annotation process – systematically addressing and refining their data feeds.
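One concrete form that gatekeeping can take is auditing a labelled dataset before it ever reaches training. The sketch below is a minimal, hypothetical example – the dataset, field names and the `audit_label_rates` helper are all invented – showing the kind of systematic check the text describes: measuring how a label is distributed across a sensitive attribute so a skewed feed is caught early.

```python
from collections import Counter

# Hypothetical gatekeeping step: before a labelled dataset reaches the
# training pipeline, audit how the positive label is distributed across
# a sensitive attribute. Dataset and field names are illustrative.
def audit_label_rates(rows, group_key, label_key):
    totals, positives = Counter(), Counter()
    for row in rows:
        totals[row[group_key]] += 1
        positives[row[group_key]] += row[label_key]
    return {g: positives[g] / totals[g] for g in totals}

# A tiny invented hiring dataset, annotated by humans.
dataset = [
    {"gender": "f", "hired": 1}, {"gender": "f", "hired": 0},
    {"gender": "f", "hired": 0}, {"gender": "f", "hired": 0},
    {"gender": "m", "hired": 1}, {"gender": "m", "hired": 1},
    {"gender": "m", "hired": 1}, {"gender": "m", "hired": 0},
]

rates = audit_label_rates(dataset, "gender", "hired")
print(rates)  # {'f': 0.25, 'm': 0.75} -- a red flag worth investigating
```

A skewed rate is not proof of bias on its own, but it is exactly the kind of signal that should trigger human review of the annotation process rather than being fed straight into a model.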

But gatekeeping at the AI inception stage is not the only imperative. It took a watershed moment like the notorious Facebook-Cambridge Analytica data scandal to illustrate what happens when big data analytics go unchecked on a societal scale. “Psychographic” algorithms not only swayed millions of Facebook users to influence the US electorate; the same analytics have been used in broad scale initiatives such as “military influence campaigns” to “conduct behavioural change programmes in over 60 countries”.

Machine learning and AI are becoming both the weapons and the tools of our future. All the more, we need regulatory safeguards (and brave, smart renegades) to firewall their unmitigated capacities and educate people on the risks and rights involved in handing personal data over to the IoT space.

Building the blockchain

As blockchain technology advances, we’ll eventually be able to turn just about anything into digital gold. Far beyond cryptocurrency, blockchain could unlock new value, liquidity and speed to market for businesses, real estate and other property-based assets.

The digital permanence of a distributed ledger system will be a game changer for the world of supply chain communication needing proof of provenance across a diverse stakeholder landscape. Electronic voting, payroll, cloud storage, even booking a hot desk – all these applications and countless more will be transformed by blockchain.
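The “digital permanence” behind proof of provenance can be sketched with a minimal hash chain: each record carries the hash of the one before it, so tampering with any past entry invalidates everything after it. This is a hypothetical Python sketch, not a real DLT implementation – a production blockchain adds distribution, consensus and signatures on top – but the tamper-evidence mechanism is the same.

```python
import hashlib
import json

# Minimal sketch of ledger permanence: each block stores the hash of the
# previous block, so altering any past record breaks the chain.
def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append(chain, record):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "record": record})

def verify(chain):
    return all(
        chain[i]["prev"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

# Invented provenance records for a supply chain.
chain = []
append(chain, {"catch": "tuna", "vessel": "MV Example", "method": "pole-and-line"})
append(chain, {"event": "processed", "site": "Cape Town"})
print(verify(chain))  # True

chain[0]["record"]["method"] = "trawl"  # tamper with the provenance history
print(verify(chain))  # False -- the alteration is immediately detectable
```

No single stakeholder can quietly rewrite history, which is precisely what makes a distributed ledger attractive for multi-party supply chains.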

But the reality is, we’re not there yet. Because the ledger can only handle so many transactions per second, the speed of tokenising is still relatively low. Software developers have to balance the pressure of moving fast while building meaningful, cybersecure solutions.

Critical to this process will be the self-executing smart contract that will hold stakeholders to accountable, unbreakable terms. But even if our tech allows it, should we be tokenising everything? Aren’t there some things better left alone? Absolutely.
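The “unbreakable terms” of a smart contract can be pictured as a simple state machine: funds move only when a coded condition is met, with no discretionary override by either party. The escrow class below is a hypothetical illustration in plain Python – real smart contracts run on-chain in languages like Solidity – but it captures the idea of terms that execute themselves.

```python
# Hypothetical sketch of a smart contract as a state machine: value is
# released only when the agreed condition evaluates true on submitted
# evidence. Class and field names are illustrative.
class EscrowContract:
    def __init__(self, amount, condition):
        self.amount = amount
        self.condition = condition  # a pure function of the evidence
        self.released = False

    def settle(self, evidence):
        # No party can release funds early; only the coded condition can.
        if self.condition(evidence):
            self.released = True
        return self.released

# Terms: release 1000 units once the goods are confirmed delivered.
contract = EscrowContract(1000, lambda e: e.get("goods_delivered") is True)
print(contract.settle({"goods_delivered": False}))  # False -- funds stay locked
print(contract.settle({"goods_delivered": True}))   # True -- terms met, funds release
```

The accountability comes from the condition being fixed up front and applied mechanically – which is also why tokenising the wrong thing, with a condition nobody can reliably evaluate, is worse than not tokenising it at all.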

Blockchain and other distributed ledger technologies (DLT) imply trust and permanence. Therefore, we don’t want to tokenise assets or data where we can’t trust the quality of the source. A WWF app, for example, now allows you to scan your can of tuna to identify details around where and how the fish was caught. Because the dataset is accountable to the source, consumers can be confident that their purchase is ethically and environmentally responsible.

On the other hand, organic pasta would be considerably difficult to tokenise; every ingredient would have to be known and tracked to its source. It’s essential we think our way through these processes and build the relevant technologies line upon line.

Data will solve some of humanity’s biggest conundrums, but it won’t solve everything. We’ll need people to drive the process and continually ensure the integrity of our data to improve insights and inform better solutions. Data alone is not the silver bullet, but when paired with human-centric design, it very well could be.

With ‘all things digital’, a generous amount of ambiguity and experimentation is always to be expected. But one thing seems fairly certain: big data will demand more of people, not less, as our digital epoch unfolds.
