Societal expectations are shifting, and we’re starting to expect more of our digital services. This is true across a myriad of areas, from preventing false advertising, fraud and misinformation to supporting sustainability.
One of the fastest-changing expectations is for internet services to pay attention to the experiences of women. This isn’t new, but what we’re seeing now is increasing attention from media, policymakers and wider society to women’s experiences online, spurred by Andrew Tate’s viral posts and the rise of incel culture. There are growing calls for platforms to perform a duty of care to users, especially women and girls.
As the Online Safety Bill continues through Parliament, campaigners are asking for the legislation to include a code of conduct aimed at combatting violence against women and girls (VAWG). Organisations like Glitch are calling for a public health approach to online safety and pushing for a duty of care to users similar to the one every other company owes its customers.
Even if the Online Safety Bill passes without this amendment, it’s clear that times are changing. Further regulation is coming. And our expectations of the online platforms we use are growing.
It’s worth reading the full Violence Against Women and Girls code of practice created by a coalition of reputable UK organisations.
Key takeaways from the code show the continuing and emerging trends in our attitudes towards tech companies and online safety.
- Tech companies need to better reflect the communities they serve. Not a new trend by any means, but the call for tech businesses to employ more diverse talent that better resembles their users is now being tied not only to their ability to serve users appropriately but also to their ability to provide a safe experience. With the continued rise of AI and data-driven technology, we’re sure to see this come into play even more heavily, given the potential for in-built biases and the emerging regulation of the sector.
- Commitments to consult in-house on DEI are no longer enough. Gone are the days of tech platforms only needing to employ a diversity champion, or even full safety teams. We’re moving towards a system where online rights advocates want further transparency and external validation. This means companies working with credible external champions, especially in the VAWG space, to hold them to account. The tech industry will need to get comfortable opening its doors to experts to keep public trust.
- Proactive action must be taken. Reactively taking down illegal and some harmful content is now the basic expectation. Increasingly, we believe that social media companies should proactively stop bad content from reaching large audiences, or from going up at all. As we’ve seen in the debate in the UK, this can be framed as at odds with freedom of expression. Yet campaigners argue that completely free spaces end up being platforms where only the abusive dominant voices get to participate – essentially censorship of a different kind. We’re seeing a similar debate playing out around Twitter right now. Going forward, companies that claim they are not choosing between users’ experiences, or are choosing not to censor, will be seen to have made a choice – and criticised for it.
- KYC comes to social media. ‘Know Your Customer’, a term traditionally used in regard to financial transactions and frequently referenced in debates around crypto regulation, could be spreading to other online services. Understanding who is using your platform is starting to come with expectations – not of exploiting that data, but of being able to serve a safe experience to that particular individual. Things are more advanced when it comes to children, with policymakers exploring the feasibility of requiring age verification in many markets. This could also extend to other populations. Notably, this movement is about more than giving users greater choice over what they see: it goes further, asking social media companies to do some of this work themselves, particularly around relevant harmful material.
- The ‘systems approach’ is the future of regulation. Functionality, and how it’s tied to a company’s business model, will eventually be the basis on which digital services are regulated. We may not be there yet, but it’s been the basis of calls throughout the UK’s online safety debates. Seen as a more agile and effective method of ensuring online safety than the approach that criminalises individual actions, such as cyber flashing or epilepsy trolling, the systems approach to regulation has the benefit of being able to respond to changing systems and norms, and won’t necessarily require new legislation. Expect online safety campaigners to keep pushing this point – calling for it to be the method adopted by empowered regulators.