No more gender stereotype ads allowed in the UK

“Our evidence shows how harmful gender stereotypes in ads can contribute to inequality in society, with costs for all of us. Put simply, we found that some portrayals in ads can, over time, play a part in limiting people’s potential,”

Advertising Standards Authority (ASA) chief executive Guy Parker.

The United Kingdom’s Advertising Standards Authority (ASA), the organisation that administers the UK Advertising Codes, has banned gender stereotype ads.

The advertising codes apply to both broadcast and non-broadcast adverts, including online and social media. Many of us can perhaps attest to seeing similar gender stereotype ads on huge billboards, especially when travelling back to our hometowns on the North South Expressway.

While we wait for the regulator to come in and regulate, as consumers we too can play a big role by speaking up against, and boycotting, brands and companies that publish such gender stereotype ads.

Should we have any minimum social security protections for the gig economy?

The Star published a report on certain e-hailing drivers being banned from a ride-sharing platform.

Gamification is one of the sales strategies used by companies, especially startups, to “motivate” their users. For instance, ride-sharing companies reward their users with points, which can then be exchanged for cheaper rides or even freebies.

For e-hailing drivers, higher ratings from users translate into better incentives and payouts. Big brands pay social psychologists a great deal of money for advice on how to exploit human psychology and drive more sales. Even some health services use gamification to encourage healthy eating.
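To make the incentive mechanics a little more concrete, here is a minimal, purely hypothetical Python sketch of a rating-tiered payout bonus. The tiers, thresholds and bonus rates are invented for illustration only and are not any real ride-sharing company’s actual scheme.

```python
# Hypothetical illustration of a rating-tiered payout bonus.
# All thresholds and bonus rates below are made up; they are not
# any real ride-sharing company's actual payout rules.

def payout_with_bonus(base_fare: float, average_rating: float) -> float:
    """Return the driver's payout after applying a rating-based bonus."""
    if average_rating >= 4.8:
        bonus_rate = 0.15   # top tier: 15% bonus
    elif average_rating >= 4.5:
        bonus_rate = 0.05   # middle tier: 5% bonus
    else:
        bonus_rate = 0.0    # no bonus below a 4.5 average rating
    return round(base_fare * (1 + bonus_rate), 2)

# Example: the same RM20 fare paid to drivers at different rating tiers.
print(payout_with_bonus(20.0, 4.9))  # 23.0
print(payout_with_bonus(20.0, 4.6))  # 21.0
print(payout_with_bonus(20.0, 4.2))  # 20.0
```

The point of the sketch is simply that small differences in ratings can compound into real differences in income, which is why the “game” stops feeling like a game when a driver’s livelihood depends on it.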

Sadly, gamification doesn’t work so well when one’s livelihood is involved. Presently, we don’t have any legal protections, such as social security, for workers in the gig economy.

The bigger question we all need to ask is: should the government step in and impose a set of minimum social security protections for gig economy workers?

When should we have the AI Playbook?

I read the recent article published in The New York Times, “How Surveillance Cameras Could Be Weaponized With A.I.”

“We face the prospect of an army of A.I. security guards being placed behind those lenses that are actually, in a meaningful way, monitoring us, making decisions about us, scrutinizing us.”

Jay Stanley, senior policy analyst at the A.C.L.U. (the American Civil Liberties Union), a human rights advocacy organisation based in the United States.

You may have seen a recent video on how the Chinese government is tracking people with CCTV cameras. The same could happen in the United States in the future. As it’s impossible for humans to sit for long hours watching endless streams of video, artificial intelligence (AI) could be used to replace humans for this job.

The issue with AI is that we have a long way to go before it’s mature enough (an estimated 30 years!). In the meantime we’re stuck, as we still need the cameras to track people, especially bad actors. Another issue that AI researchers are trying to fix is unexpected discrimination: recent news reports say that facial recognition works better if you have lighter skin.

These are some of the reasons why we should have a playbook, or regulations, in place to set the ground rules on the limits of AI when it comes to human surveillance.