World Wide Web Foundation | https://webfoundation.org
Founded by Tim Berners-Lee, inventor of the Web, the World Wide Web Foundation empowers people to bring about positive change.

With the web becoming an increasingly monitored space, each of us has a role to play in safeguarding online privacy
https://webfoundation.org/2022/12/with-the-web-becoming-an-increasingly-monitored-space-each-of-us-has-a-role-to-play-in-safeguarding-online-privacy/ | 21 December 2022


This post was written by Carlos Iglesias, Senior Research Manager, and originally published for The Lloyd’s Register Foundation World Risk Poll.


Across the world, in every culture and society, there is a notion of privacy and freedom. These notions may take different forms or fall under different categories, but they exist everywhere, underpinned by international frameworks like the Universal Declaration of Human Rights. As our lives are increasingly played out online, the rules that protect the universal and fundamental right to privacy are of paramount importance.

Today, the Web is one of the principal means by which individuals exercise their right to freedom of expression and information. It provides essential tools for participation and discussions concerning political and general interest issues. Online platforms are becoming the new public squares, where public discourse and debate take place. They provide a space for communities to come together, discuss, share ideas, and make change happen. They are now our town halls and our cafes, our libraries, and our newsagents.

A whole new set of privacy challenges

At the same time, fuelled by a drive towards more powerful and complex data analytics, and by the increasing amount and granularity of personal data available today, more and more of our online lives are under observation. We now face a whole new set of challenges regarding our privacy and data rights, not just from the indelible parts of our digital identity that we frequently and actively share ourselves, but also from the sheer number of additional data points that are generated or attributed to us through monitoring and inference from our online activity – all too often without our knowledge.

According to the latest Lloyd’s Register Foundation World Risk Poll report – A Digital World: Perceptions of risk from AI and misuse of personal data, at least three out of four internet users worldwide are ‘very’ or ‘somewhat’ concerned that their personal information online could be stolen (77%) or used by companies for marketing purposes without their permission (74%). In addition, two-thirds of them (68%) are equally concerned that their personal data could be used by the government without their permission.

While we are more aware of the perils of persistent online tracking, targeted advertising, and surveillance capitalism, we still don’t know much about how it all really works in the background, or to what extent it might affect our online and offline lives and privacy. This clearly limits our capacity to react, object and opt out, generally leaving us without alternatives.

A deepening feeling of inequality

Moreover, the online world always mirrors and frequently amplifies offline harms and inequalities and, as a result, marginalized groups are more acutely affected by safety and privacy issues. The World Risk Poll report also shows how lower-income internet users, or those who said they had experienced discrimination based on their skin colour, race/ethnicity, sex, religion, or disability status, were more likely to be very worried about possible misuses of their private information. This demonstrates how privacy challenges contribute to deepening inequality and make the web feel like a less welcoming place.

There are increasing calls for new approaches towards respecting the data rights of individuals and communities and allowing users to retain a sphere of privacy and autonomy to explore the web freely, without the constant threat of interference or surveillance. People are demanding greater control over their privacy and stronger data rights. Strong privacy rules enable data to be used in a protective way: they establish guidelines for the appropriate use of data based on its purpose and the surrounding circumstances.

Our right to privacy demands more ambitious policy proposals and solid regulatory data protection frameworks. We need to start questioning current practices and look for more effective solutions. People and their privacy must be at the forefront of all discussions. That’s why the Web Foundation is calling on governments, companies, and citizens to build a new social Contract for the Web and adopt a shared set of commitments for respecting, protecting, and fulfilling people’s online data rights.

Governments must protect people’s personal data by establishing and enforcing comprehensive data protection frameworks and rights to foster online trust. Companies must give every user options to access online content and use online services that protect their privacy. Citizens should take proactive steps to protect their individual and collective privacy and security by demanding privacy-enhancing products and services and articulating their privacy preferences accordingly.

Everyone has a role to play in safeguarding privacy online so we can use the internet safely and without fear.

Online Gender-Based Violence Story – Maria, Costa Rica
https://webfoundation.org/2022/12/online-gender-based-violence-story-maria-costa-rica/ | 1 December 2022


Activists are particularly vulnerable to online gender-based violence, especially when they threaten profits and challenge corporate greed. Maria’s story shows what can be achieved when women feel empowered to use digital technology to advocate for their communities, and are able to overcome the threats that they face.

Maria, 45, has worked the land all her life. She loves the beautiful place where she grew up – where the rainforest meets the sea – as well as the Afro-Caribbean culture and way of life of her community. Many generations of her family have lived here. Her family taught her to work with a variety of crops and farm animals. It is very important for her community that small farms can ensure food security. Together with other women in her community, she is committed to maintaining the dignity and pride of being farmers.

In Costa Rica it is said that there are two countries: the greater metropolitan area, where opportunities and services are concentrated, and the coastal and border areas, where people live in more vulnerable conditions. This divide is exacerbated by multinationals which, for the past 20 years, have been settling in the poorest areas of the country to plant monoculture crops, threatening not only the ecosystems but also the inhabitants of the communities, who are expelled to make way for large plantations.

Where Maria lives, pineapple cultivation has been expanding at great speed. The water of the rivers and seas has been contaminated with chemicals, and the area has been inundated with pests related to pineapple farming, such as flies. 

Distressed by these changes, Maria set out to defend her beloved land, culture and community’s way of life. She understands that food security must be defended, that water sources must not be damaged, that biodiversity must be preserved, and that the way her community lives, based on generations of farmers and rural culture, must be protected. 

So she and her companions decided to start organising the community. They began to use WhatsApp to create working groups, send documents, and call on people to join the struggle against the multinational. They used social networks to build a network of alliances at the national and international level, and photographs taken with their cell phones to document evidence of the environmental and social impacts of the pineapple plantation. They also used digital communication to advocate their cause with journalists, members of congress and lawyers, encouraging these influential individuals to take up the fight.

Growing a mass movement

But this digital activism exposed Maria and her allies to abuse, as is often the case with highly visible female activists. The same channels Maria used to create a social movement were also used to persecute and threaten the activists and their families, especially their daughters. As Maria’s efforts expanded into a national movement against abuse by multinationals, attempts to silence and discredit Maria and her allies intensified, with personal details about her life shared in an attempt to intimidate her.

Although Maria was scared by these threats, she has continued her work, as the problems which led to her activism remain in her country and community. After more than four years of work, the network of organisations led by Maria and her colleagues succeeded in limiting the expansion of pineapple cultivation in the country, but the multinationals have not gone away, and they are constantly pushing to appropriate land.

Students, social movements, political parties, lawyers, journalists and, above all, other communities have joined the struggle against monoculture. All communication is done through digital media and, because of the experiences of the women at the forefront of this struggle, extra care has been taken to ensure that digital safety and security are prioritised as part of planning and coordination. This is a constant struggle, and women are leading the way.

This story was told with assistance from Sula Batsu, a member of the Women’s Rights Online (WRO) network.

Online Gender-Based Violence Story – Aisha, Nigeria
https://webfoundation.org/2022/12/online-gender-based-violence-story-aisha-nigeria/ | 1 December 2022


A prominent activist from one of the most visible online campaigns in recent years reflects on a life led under the constant threat of attacks and online gender-based violence, and on the need for resilience as well as policy changes to protect online diversity.

Aisha, co-convener of the famous ‘#BringBackOurGirls’ movement, which pushes for the rescue of the 276 girls kidnapped from a boarding high school in Northern Nigeria by a terrorist group in 2014, never calls herself an activist or even a human rights defender. She says she is simply an ordinary citizen who is active.

Aisha was born and bred in Kano, in Northern Nigeria, and remembers challenging authority from as far back as when she was only four years old. She was a very vocal child, sometimes to the chagrin of close family and friends. Her father was a strong, vocal personality who believed in girl-child education and encouraged his daughters to have independent, informed opinions about current issues.

In 1992, while at university, Aisha joined her first public protest. It was against the clear instructions of her parents, who were afraid she might be expelled. They had paid a huge price for Aisha to have access to quality education despite financial difficulties that at times made it hard even to feed the family. Aisha’s dream from childhood was to be financially independent. “When you are poor, you are nameless, faceless, and voiceless,” she says, and she gets livid when she sees politicians weaponizing poverty. “It’s a debilitating thing. It cancels you. The greatest ‘cancel culture’ is poverty, nothing else. People look right through you and don’t see you at all.”

The beginning of a crisis 

On February 25, 2014, 59 boys were killed at the Federal Government College of Buni Yadi in Yobe State, Nigeria, and 24 of the school’s buildings were burned down. No group claimed responsibility, but according to media and local officials, the radical Islamist militant group Boko Haram was suspected to be behind the attack.

Barely two months later, 276 girls were kidnapped in the dead of night from their high school dormitory by the same terrorist group. On the 30th of April, Aisha marched in street protests to the Nigerian National Assembly for the rescue of the girls. Aisha did not even know about Twitter until a few days afterwards. It had been planned at the protest that protesters would share the ‘#BringBackOurGirls’ hashtag hourly. So Aisha downloaded the Twitter app, and her very first tweet was ‘#BringBackOurGirls.’

The online movement grew, drawing enormous local and global attention, putting the government of the day under immense pressure and forcing it to focus not only on the kidnapped girls but also on the broader issue of terrorism in North East Nigeria.

An intensely personal pain 

The kidnapping forced Aisha to sit back and reflect on her past, and this reflection shook her deeply. “31 years ago, I was the ‘Chibok girl’ who would have been kidnapped,” says Aisha. “I was fighting to stay in and to finish school. I was fighting to take care of myself and my family.” During the protests, as she led the march to the office of the then chief of defence staff, Aisha had broken down and wept. “I just put my head in my hands and I cried. It was a lot of emotions to deal with. It had been very difficult for me to go to school, because many people in my community did not want the education of a girl-child. I know what it is like being insulted and attacked for being in school. And I knew if I would have been abducted while writing my final exams in high school in 1991, my parents would never have been able to challenge anyone into action because they were poor and no one would have paid them any attention. I knew I could have been abducted several years ago and not been alive today. So for me, the whole thing became quite personal.”

The ‘#BringBackOurGirls’ movement continues to have up to 100 active members at any given time, and engages in peaceful protests and advocacy with government for the rescue of all kidnapped girls held captive by terrorist groups in Nigeria. Beyond these issues, Aisha has, since 2014, continued her consistent activism, demanding government accountability and an end to impunity, pushing back against human rights violations, and advocating for women’s rights, democracy and good governance. She actively criticises the government’s policies and programmes and, as expected, has attracted both good and ‘not-so-good’ attention, making quite a number of enemies.

Refusing to be silenced

Aisha is very vocal on Twitter and is particularly critical of the Nigerian President and his government. She has, inevitably, faced abuse for her activism. “People expectedly have come after me but I typically don’t mind what people are saying. The aim of these online attacks is to silence you. And one thing I have noticed is that a lot of people’s voices have been shut down by these attacks online, especially female voices. Many are now afraid they will be attacked and are afraid of airing their views. That’s why I take on the bullies, I learned that bullies thrive on people’s silence. They will come in droves but we need to fight back.”

Aisha has also been constantly abused online about having a “big mouth”, with someone once tweeting a picture of her side-by-side with a baboon. She has been called “mentally challenged” due to a curse, while others accuse her of being an ‘immodest woman’, especially for leading protests wearing the hijab. People also often spread lies about Aisha online, for instance insisting that she is sponsored by foreign or partisan groups, or that her husband, a retired public servant, has been convicted of corruption.

In 2019, a decisive election year in Nigeria, the attacks went beyond name-calling online. “In the year 2019, it would seem that about every two months or so, my Twitter account would be taken down. I think there was mass reporting. I didn’t do anything wrong. And I said if Twitter would get my account suspended, I’m not going to come back to Twitter again. But there were people who actually took up the case and worked on it and then Twitter kept restoring my account each time this would happen.”

All in a day’s work

Aisha generally thinks of being attacked offline or online as an occupational hazard. She isn’t quiet in the face of bullying and doesn’t run from a fight. She joined the 2020 #EndSARS protests in Nigeria with excitement and enthusiasm, happy that Nigerian youth had finally woken up and were taking the lead in the struggle for human rights. She took to Twitter to encourage the young Nigerians leading the protests to focus and avoid all distractions until the government met their conditions.

Aisha is used to triggering people just for being a woman who does not fit the stereotypes. However, some attacks have been too sinister for Aisha to gloss over. 

“I can ignore a lot of things,” Aisha says. “But not things like being looked upon as a terrorist. For an investigative journalist to allude to the fact that I am a terrorist-sympathiser on Twitter? People asked me to just ignore him, but I know where he’s coming from. When someone is linking a Muslim hijab wearing woman to terrorism, that is deliberate, they know what they are doing! You can’t just ignore them.” Livid, Aisha dared the journalist to come forward publicly with any data he may have on her and she fought back.

“Silence encourages bullies. There is such a thing as self-defence. People threaten to beat me up and say openly they’re going to kill me. People called for my killing during the #ENDSARS protest, there was an actor that mentioned that I should be lynched. I have three sisters who look like me, and some people actually confuse us for one another. One of them, her husband, has had to make her cover her face completely, because of harassment she has been subjected to publicly. Another was once attacked in the market. So online attacks sometimes are not just exchange of words, they can also have serious ramifications offline.” 

Aisha is very conscious about her security. Geolocation features are permanently turned off on her phones and apps, and she makes no posts about exactly where she is or where she is going to be. But she also doesn’t take threats too much to heart, because she thinks that if it’s her time, she’ll die anyway. “When it’s time for my death, I’m going to die anyway, and if anyone wants to take that decision to be the one to kill me, fine and good.”

Finding a safety net

In the midst of these attacks, Aisha says she has had tremendous support from various stakeholder groups. When her Twitter account was being repeatedly taken down, people stood up for her.

“A lot of people in tech and advocacy fought for me and reached out to help me. Nigerians defended me on Twitter. I felt Twitter’s support at this time, and there were just many Nigerians from within and outside the country in the tech and advocacy ecosystem who did everything to ensure that I had that level of support. Honestly, I think my supporters may be the silent majority. Most of them just don’t say anything publicly for fear of backlash. But when I meet people privately, they are following my work and telling me how much what I do means to them. These people who support me publicly and privately are my support system. My family is amazing. My husband is a huge support. My husband is a gentle giant, he won’t speak or hurt a fly, just as long as you don’t touch Aisha.”

Aisha thinks stakeholders are not doing enough to keep citizens safe online, especially women and girls who suffer disproportionately from online abuse as a continuum of the systemic inequalities and barriers faced offline. 

The Nigerian Cybercrime Act of 2015 does not specifically criminalize cyber violence against women, gender-based slurs, or misogyny online. Still, the law addresses certain important aspects of cyber violence in mentioning ‘cyberstalking’ and ‘pornographic messages.’ However, there are no clear reporting mechanisms, the police hardly take reports of online violence against women seriously, and the official rate of reporting is not known. Evidence suggests that women are generally more prone to behaviours such as self-censorship or not using social media or data due to online safety concerns. In the recent context of COVID-19, the restrictions on movement and the spike in the rates of violence against women (both offline and online) limited access to justice, support systems and social services even more severely.

Aisha wants a safer environment for people online, in order to bring more diverse voices to the online space and to ensure freedom of expression for everyone. In terms of coping with online violence, Aisha doesn’t think much help is coming from ‘outside’ anytime soon. She thinks it’s best to focus on building internal resilience in the meantime, and this is Aisha’s way of dealing with bullies. “Twitter is a bit more proactive than the rest. The issue with my Facebook account has still not been resolved till date. Sometimes, it seems nobody is really listening and nothing is being done. It is very important for people to know and be accepting of who they are and not be on social media to seek external validation. You have to get to the place where you can accept and assert yourself, flaws, imperfections and all. Knowledge is also very important. That’s one of the key things that you can use to engage meaningfully online. Learn more, read more, and be very open. And at the end of day, you know, develop that thick skin.”

This story was told with assistance from Tech Societal, a member of the Women’s Rights Online (WRO) network.

Asia-Pacific IGF 2022: Takeaways on Tackling Deceptive Design Across the Asia-Pacific Region
https://webfoundation.org/2022/11/asia-pacific-igf-2022-takeaways-on-tackling-deceptive-design-across-the-asia-pacific-region/ | 30 November 2022

This blog post was written by Kaushalya Gupta, Policy Program Manager and Lead on the Tech Policy Design Lab on Tackling Deceptive Design and Moving Towards Trusted Design.

The use of deceptive designs, also known as ‘dark patterns’, is harming consumer safety, privacy, and trust around the world. Deceptive design practices are widespread across the Asia-Pacific, preventing the 60% of the world’s population who live in the region from leveraging the internet to its full potential.

The Web Foundation’s panel and workshop at the Asia-Pacific Internet Governance Forum, held in Singapore in September 2022, brought together around 100 participants, offline and online, comprising researchers, policy experts, designers, entrepreneurs, and members of civil society. Together, we mapped out the gaps in the design of digital products and platforms and the particular challenges for people in the region, identified major blockers in existing efforts, and discussed solution models that combine design, advocacy, and innovative policymaking to advance an alternative future with trusted design.

Image: Sage Cheng, Access Now (left), and Kaushalya Gupta, Web Foundation (centre), at the Asia-Pacific Internet Governance Forum session moderated by Anju Mangal (right)

The following five takeaways represent key challenges and opportunities in tackling deceptive designs in Asia-Pacific:

  • The term may be unfamiliar, but the experience is widely shared: While the terms ‘deceptive design’ and ‘dark patterns’ were relatively new to the audience at the Internet Governance Forum, almost every participant in the workshop had lived experience of being tricked by deceptive designs. In addition, participants were appreciative of the Web Foundation’s shift to the term ‘deceptive design’ from the original term ‘dark patterns’, a shift which also served as an impetus for Harry Brignull, who coined the term ‘dark patterns’, to adopt the new term.
  • Deceptive design leads to disempowerment: Beni Chugh’s research explores common deceptive designs and their adverse effects on consumers in India through a financial inclusion lens. In short, users are being misled into signing up for financial products they may not need. Additionally, deceptive designs interfere with democratic processes and influence citizens’ political choices. As a result of these incursions into users’ autonomy, web users are largely disenfranchised and disempowered. Beni Chugh’s research paper concludes with open questions that India must contend with when regulating deceptive designs.  
  • Deceptive design comes with a cost: Chandni Gupta’s presentation looked closely at ten of the most common deceptive designs in Australia, based on the recent Duped by Design report published by the Consumer Policy Research Centre. These practices take a toll on users’ emotional wellbeing, their finances, and their control over their personal information, and can also come with a cost to businesses. While 83% of Australians surveyed in the study have experienced one or more negative consequences of deceptive designs, only 58% were aware that organisations use deceptive designs to influence them to behave in a certain way. A consumer-centric approach from businesses, regulators and government could help mitigate consumer harm.

  • Designers do not want to deceive, but feel constrained: During the co-creation workshops, public gatherings at conferences and events, consultations, and key informant interviews undertaken as part of the Web Foundation’s research, it emerged that designers do not want to make deceptive designs, and most platforms do not want to perpetuate these practices but are somewhat constrained by growth models in a data-driven digital economy. Therefore, solutions must be discussed openly with stakeholders representing different sectors and regions in order to be agreed upon and implemented.

  • A checklist of dos and don’ts can come in handy for designers and beyond: Sage Cheng gave an overview of the Dos and Don’ts Checklist that is currently under development for designers to consider in their UX/UI practices. This list is an ongoing collaborative effort of a group of designers, researchers, technologists, and civil society members convened at gatherings such as RightsCon, Interaction 22, MozFest, and the Web Foundation’s Tech Policy Design Lab.

By taking a design-led approach, the Web Foundation’s Tech Policy Design Lab is currently developing a portfolio of UX/UI prototypes to demonstrate a set of ‘trusted design’ principles. These principles were co-created by stakeholders across the globe through a series of workshops held in collaboration with 3×3 and Simply Secure. According to these design principles, digital products and platforms should respect human rights, ensure equitability and accessibility, stay informative and transparent to people who are using them, prioritise at-risk communities, and keep the product experience burden-free. 

Deceptive design has been a difficult problem to tackle, partly because the language we use to describe it means different things to different stakeholders. Interventions also often come in silos, whether policy and regulation or ethical design initiatives, in the Asia-Pacific region – a challenge that trusted and responsible design advocates are grappling with globally. While several actors are working on similar spaces and issues, they are often not seen as working on the ‘deceptive design’ or ‘dark patterns’ issue per se. Participants were encouraged to continue collaborating on this issue at the Web Foundation’s Tech Policy Design Lab. We will submit the outcomes of the Lab to the UN Global Digital Compact to lay out shared principles for an open, free and secure digital future.

To learn more about the deceptive design issue in the Asia-Pacific region, we have put together a reading list below:

  • Australia
  • India
  • Korea
  • New Zealand
  • South-East Asia

Online Gender-Based Violence Story – Isabella, Peru
https://webfoundation.org/2022/11/online-gender-based-violence-story-isabella-peru/ | 28 November 2022


When online gender-based violence is of an intimate nature, and takes the form of “revenge porn” or other sexual attacks, it is often very difficult for survivors to come forward, out of a sense of embarrassment or shame. Isabella’s story shows that, with courage, survivors can come forward to educate others about their experiences, and that solutions need to place the blame on the perpetrators and deal with their actions.

Isabella is a 17-year-old girl. She was born into a traditional household in Jauja, a region in the central highlands of Perú. Her family moved to Lima when Isabella was a child, just as she was about to start school. She’s the oldest of three siblings and, since there’s a 10-year gap between her and her youngest sister, she has been involved in her sister’s care and upbringing. She’s seen as a role model in her family and has developed a strong sense of security, which brings with it a lot of pressure.

She lives in Carabayllo, in the northern area of the city of Lima. Carabayllo is known to be a lower-income district. The internet connection there is poor and expensive given her family’s means, and there are few service providers to choose from. During the pandemic, Isabella and her classmates had a rough time adapting to online classes and feel as if they have lost a lot of time. This leaves Isabella frustrated about her education and her future, and affects her motivation.

But even in this difficult context, Isabella faced an even more acute and distressing challenge. She recently discovered that her ex-boyfriend – whom she broke up with two months ago because he was becoming increasingly possessive – has shared intimate photos of her online.

She felt very ashamed and didn’t know who to talk to. Her family was not an option, because her parents didn’t approve of her relationship and its potential impact on her studies and family responsibilities. But when she met him, she felt free and enjoyed his company, and began to trust him very quickly. As a teenager, newly exploring her sexuality, she shared intimate photos with him, thinking it was a safe way to express herself.

Knowledge is power
At first, she didn’t even know how to put into words how she felt. She knew she had been betrayed, but also felt guilty for trusting him in the first place. She decided to confront him and ask him for an explanation. However, before doing that, she began researching online for guidance on the implications and consequences of her choice.

Her research confirmed that she had been a victim of a type of gender-based violence. She also learnt that she would potentially have to deal with her images going viral, and with not knowing who had access to them. She prepared herself for retaliation if she confronted him, and for the situation to become worse, but knew it was her only option and her chance to hold him accountable for his actions. As expected, her confrontation was met with threats and further abuse.

This story was told with assistance from Hiperderecho, a member of the Women’s Rights Online (WRO) network.

Online Gender-Based Violence Story – Maria, Peru
https://webfoundation.org/2022/11/online-gender-based-violence-story-maria-peru/ | 28 November 2022


All too often, online gender-based violence is perpetrated by those one knows intimately. Maria’s story illustrates the impact of this, and gives insight into the measures that can be taken to protect women from their abusers, and the power of women working together to achieve change.

Maria is a 25-year-old university student who grew up with her family in a low-income neighbourhood in Lima, Peru. She loves art and has ambitions to one day be a well-known and respected reporter, a dream she decided to pursue by going to university.

In her first year of university, Maria met and started a relationship with Luis, a fellow student. After Maria ended their relationship, Luis began to threaten and harass her, hiding behind a series of fake email addresses and social media accounts in order to do so. He resorted to blackmail, threatening to publicly share intimate material they had shared during their relationship if she refused to go back to him and rekindle the relationship.

Although this abuse took place in online spaces, the fear Maria felt started to impact her daily activities and behaviours. She became scared of him appearing in the same classes at university, and she began altering her movements to avoid bumping into him. Eventually, Maria decided to confide in a friend.

A united front
Fortunately for Maria, she was not alone. Within her faculty at university, there were many other women already calling on university authorities to address all forms of gender-based violence that occur on campus. The Women’s Assembly, a space which brings together feminist activists, had already started conversations and proposed solutions to create a safe space for women on campus, free from harassment, allowing women to study and enjoy university life. Through her friend, who led the Assembly, Maria’s story was added to the evidence being gathered to illustrate the terrible impacts that gender-based violence, both offline and online, can have on women, especially in an academic environment. At Maria’s university, this cause was further championed by a feminist professor, who could influence decision-makers.

With the help of this professor, and through the efforts of her fellow students, Maria’s concerns were heard and taken seriously by university authorities. Although official university policy around sanctions for all forms of gender-based violence still needs to be changed, measures were put in place to prevent Luis from enrolling in the same courses as Maria, allowing her to focus on her studies and future career.

The Assembly will continue to call on authorities to codify these measures in official policy, and to offer other types of support, including legal and psychological assistance. These measures would remove the pressure from students who experience abuse and who, like Maria, have to make special arguments in order to be protected. But the future looks bright: the Assembly continues to grow, and stories like Maria’s show that simple solutions focused on sanctioning the perpetrators of online gender-based violence can have a real, positive impact on their victims.

This story was told with assistance from Hiperderecho, a member of the Women’s Rights Online (WRO) network.

Online Gender-Based Violence Story – Gifty, Ghana
https://webfoundation.org/2022/11/online-gender-based-violence-story-gifty-ghana/ | 28 November 2022


High-profile women such as journalists and politicians are regular targets of online gender-based violence, especially when it is weaponised as a tool to intimidate and silence. Gifty’s story illustrates the power of courage and resilience.

Gifty* is a broadcast journalist in her early 30s who lives in Accra. She has always been driven by a desire to improve the lives of others in her community, and uses her radio show to share stories related to service delivery, political corruption, and other sensitive issues that affect people’s lives. In high school she discovered the power of the media, when a local female reporter’s efforts and spotlight on sanitation issues in Gifty’s neighbourhood led to improvements and commitments from local authorities – something the community had not been able to achieve by itself. From that moment, she knew that journalism would be her future.

Even though Gifty’s life is more comfortable now than the environment she grew up in, she is still confronted on a daily basis with inequality and poverty. Her community does not have a health facility or a basic school in the area. Each morning as she drives to work, she sees school children walking several kilometres to get to school. Even though she sometimes picks some of them up and drops them at school or closer to their schools, she is not able to help them all, and this weighs on her. As she passes through the community each day, she becomes even more fired up to talk about the socio-economic and political issues in the country.

Inevitably, and sadly, her criticism of those in power, who are failing to meet the needs of the people they govern, has led to attacks and intimidation. While some listeners call in to her radio station, others attack her on her social media platforms and personal mobile phone. And, arguably because she is a woman, these attacks are not focused on her political views: Gifty faces body shaming, rape threats and insults. The personal nature of these attacks makes Gifty feel particularly unsafe. Although for the most part the perpetrators hide behind anonymity, Gifty knows that they are real people, with the capacity to hurt her even further, and she has at times been too scared to sleep in her own house, knowing that she might be found.

It would be easy for Gifty to retreat into self-censorship – and this was her initial response to this abuse. But then she remembered another inspiring story – about another courageous woman. The story was about a female politician who was aspiring to be the secretary of a ruling political party. In her campaign, she also faced attacks and intimidation, but took control of the situation, identified her attackers, mapped out strategies to outsmart them, and eventually won the position. It might have been fiction, but the message was powerful enough to encourage Gifty to continue fighting – especially as she realised that her silence would mean silencing the issues she had dedicated herself to championing for her community.

So Gifty has reclaimed her power. She initially sought help through the few official channels open to her in order to report and block her abusers, but says that platforms have done very little to help. So she went to the police, both to bring her abusers to book and to ensure that she is better protected in the future. She has highlighted this issue at government level too, pushing for better protections for women and children. And amid all of this she has also found professionals who can offer her psychological support and teach her techniques and tactics for increasing her resilience.

Gifty also knows she is not alone. This is a phenomenon far too common among female journalists, and she has reached out to a support network of others who have faced similar experiences. She is hopeful that with the support of her allies, she can continue to express herself freely and initiate public discourse. She is also hopeful that she will continue to inspire change for other female journalists and female activists.

This story was told with assistance from Media Foundation for West Africa, a member of the Women’s Rights Online (WRO) network.

Tackling Deceptive Design Across the African Continent
https://webfoundation.org/2022/10/tackling-deceptive-design-on-the-african-continent/ | 6 October 2022

This post was written by Adedolapo Evelyn Adegoroye and Victoria Adaramola, Tech Hive Advisory. We’d like to thank our partners at Tech Hive Advisory for their work spearheading deceptive design research using participatory methods across Africa. Follow Tech Hive Advisory on Twitter @HiveAdvisory.


Tech Hive Advisory and Ikigai Innovation Initiative, in collaboration with the World Wide Web Foundation and the Tech Policy Design Lab, hosted the Deceptive Design Policy Hackathon in August 2022. The hackathon was a two-day virtual event aimed at raising awareness about deceptive design as well as co-creating a policy solution and deceptive design guidelines for the African continent.

During the hackathon, five teams of participants from all over Africa presented their policy findings and solutions after conducting their research. The hackathon also featured global experts and African regulators who shared timely insights and recommendations for combating deceptive designs on the continent. The event produced the following recommendations:

  • Legislating against deceptive designs 

The absence of a comprehensive framework makes it difficult to determine what constitutes a deceptive act or manipulative technique and the appropriate penalty for deceptive design practices. A multistakeholder approach is recommended to capture the scope of deceptive designs under the law and to ensure regulators are not creating laws in a vacuum. The legal framework could either update a number of existing laws to protect against misleading designs or use a single rule or set of guidelines to deal with them. The law should also make it a legal requirement for people in the industry to recognise digital rights as human rights.

  • Design audit tool and process  

Regulators can develop independent audit processes that digital platforms can use to check compliance. A design impact assessment could be mandated for new and existing products to check for manipulative designs and correct them. The tool or process would judge the impact of a design and serve as a foundation for regulation.

  • Complaints mechanism 

There needs to be a transparent mechanism for reporting deceptive designs to regulators. More importantly, the mechanism should respond quickly to reports of deceptive design and investigate them. In addition, reports on investigations and decisions should be published periodically.

  • Awareness creation and advisories  

Regulators, civil society, and academia must shoulder greater responsibility for educating users online and for researching manifestations of deceptive designs on the African continent. This would give users a better understanding of their rights and the ability to boycott abusive services.

  • Use of regulatory sandboxes

There should also be regulatory sandboxes that companies can use to test their products. The goal is to ensure that products are built within the rules from the start.

  • Unified regulation 

There needs to be a harmonisation of design standards across Africa. Many digital platforms either operate or aspire to operate across countries and continents, and fragmented frameworks make it more difficult and less enticing for tech companies operating across borders to comply. Regulators should develop operational guidelines, codes of ethics, and best practices for UI/UX designers, tech associations, software engineers, digital marketers, and tech companies. These codes of practice should embody the design principles of no surprises, transparency, fairness, and accountability, and set defined standards that a user interface must meet.


Post-hackathon 

We were happy to work with a group of researchers across Africa to develop a policy brief and a set of guidelines that regulators can use to tackle deceptive designs. Our policy brief provides insight into the state of play and policy around deceptive designs in Nigeria and Kenya as case studies, and examines various expressions of deceptive designs on e-commerce, e-transaction, and flight aggregation platforms.

The policy brief also adopted a number of the recommendations suggested during the hackathon. Our policy guidelines will also aim to create a set of rules that regulators in various countries can use to make the digital world safer and more trustworthy.


To learn more about the Web Foundation’s work on trusted design, visit techlab.webfoundation.org. Our team is working to synthesize findings from our workshop series that brought together stakeholders from across sectors and across the globe to co-create solutions to deceptive design. Stay tuned for more details in our upcoming publication.


For more updates, follow us on Twitter at @webfoundation and sign up to receive our newsletter.

Tim Berners-Lee, our co-founder, gave the web to the world for free, but fighting for it comes at a cost. Please support our work to build a safe, empowering web for everyone.

A web for everyone: strengthening accountability for online gender-based violence
https://webfoundation.org/2022/09/a-web-for-everyone-strengthening-accountability-for-online-gender-based-violence/ | 21 September 2022


This post was written by Katherine Townsend, Director of Policy (Interim). Follow Katherine on Twitter @DiploKat.


When Sir Tim Berners-Lee, co-founder of the Web Foundation, invented the World Wide Web, he intended to open up the technology of the internet for everyone around the world to enjoy. He reinforced this vision at the opening of the 2012 Olympics, tweeting of the web: “This Is For Everyone”.

The Web Foundation’s mission is to realize this original vision and work for a safe, trusted, and empowering web. One of the greatest threats to the web we want is the pervasive presence and impact of gender-driven harassment and violence online.

Women, and other minoritized genders, are harassed every day because of their gender. Over 50% of the global population experiences or witnesses an online space that allows harassment and violence without consequences. Too frequently, the only response that will keep women safe at the moment is to remove their profiles and presence online. This may not be their first step (they file complaints with platforms and with police), but without a response many choose to take their presence, their voice, and their leadership offline. Increasingly, the web becomes a place that is not for everyone, but one that is absent of women, especially women from marginalized communities including Indigenous communities, darker-skinned women, LGBTQ+ communities, and minoritized ethnic groups within different countries, and a place that is populated with those who feel encouraged to abuse and attack. This leads to a dangerous World Wide Web, filled with fragmented spaces as more people seek their own sources of comfort and support in virtual spaces, for good or for ill.

Some tech companies have recognized the gravity of this problem and have made public commitments to counter online gender-based violence. For the past year, Meta, Google, Twitter, and TikTok have worked with the Web Foundation and with global civil society, particularly gender justice and digital rights organizations, to improve their reporting, increase autonomy for users, and share transparent data about their progress. These companies announced their commitments over a year ago at the United Nations Generation Equality Forum, and this week, during the UN General Assembly, we are sharing our accountability report detailing the progress made by these companies: what they’ve been able to accomplish, what barriers remain, and what value the Web Foundation adds in supporting this work.

We found that most of the companies have made some progress, primarily on reporting and on user interfaces. The collaboration between Twitter and Google on the harassment manager tool helps any user regain control over their own experience of the platform and over reporting. TikTok and Meta have also made some inroads into easing reporting and allowing users to limit or block abusive interactions. These changes are promising, and we want to see more. We continue to be most encouraged by Twitter’s open API. While not a development of the past year, this important tool allows entrepreneurs outside of the company to build their own solutions to protect against violence online and allows access for independent researchers, both key to tackling this problem.

Where we’re still falling short is transparency about the magnitude of violence and harassment online, and about what impact the implemented changes are having on the number of instances and the degree of harm. To address this, we require a global, multi-stakeholder effort to agree on the standard of information and level of detail companies must share, and to develop mechanisms to make this information publicly available. In speaking and working with global civil society, we’ve also heard that the fundamental approach to improving the platforms, one focused on what happens after the harassment occurs, relies on individuals learning a new system for reporting their harassment, rather than taking a prevention-centered approach. Reporting systems are designed by the platforms with zero to minimal input from the communities most affected by online gender-based violence and with little consideration of local contexts outside the Global North. Designed in this way, the act of reporting harassment can serve to retraumatize and, without a clear response and action, can cause more harm than good.

The Web Foundation is committed to working across tech platforms, global civil society, and other key partners to build a better web for all: the multi-dimensional nature of online gender-based violence requires a global, multi-stakeholder solution. By convening these networks, designing better policies, and co-creating products, we can make the web a single space that is safe, trusted, and empowering. A web that is for everyone.


Explore the report

The Web Foundation is committed to working in partnership with organizations across the globe and wants to acknowledge the commitment and effort contributed by a large range of stakeholders. In particular, we would like to acknowledge the engagement of global civil society organizations, tech companies, facilitators, and report contributors. 

The organizations and individuals who contributed to the writing of this report include Katherine Townsend, World Wide Web Foundation; Gabriela de Oliveira & Hilary Watson, Glitch; Manon Desert & Luisa Braig, Social Finance; Paulina Ibarra, Fundacion Multitudes; Marianne Diaz, Access Now; Juliet Nanfuka, Collaboration on International ICT Policy for East and Southern Africa; and Marwa Azelmat, RNW Media.


For more updates, follow us on Twitter at @webfoundation and sign up to receive our newsletter.

To receive a weekly news brief on the most important stories in tech, subscribe to The Web This Week.

Tim Berners-Lee, our co-founder, gave the web to the world for free, but fighting for it comes at a cost. Please support our work to build a safe, empowering web for everyone.

Building Blocks for OGBV Accountability
https://webfoundation.org/2022/09/building-blocks-for-ogbv-accountability/ | 21 September 2022

Through engagement with civil society, tech companies and government stakeholders, we identified six core building blocks that are required for greater accountability on online gender-based violence (OGBV). All of the building blocks complement and reinforce each other while looking for “solutions beyond women self-censoring”, self-censorship being an increasing trend in the face of rising OGBV. Many individuals, communities and organizations are already working across these building blocks to improve OGBV accountability.

Regulation and Enforcement Mechanisms

Regulatory bodies play a crucial role in placing clearer duties on tech companies and broader internet intermediaries to increase transparency and protect women and gender-diverse people online. To do this, they need to acknowledge and respond to OGBV specifically, including by amending existing legislation through a gender lens and by addressing environments that allow for OGBV (e.g. weak or non-existent data protection regulation). Equally important is the establishment of effective and fair enforcement mechanisms that are grounded in principles of diversity, equity and inclusion.

Public Awareness and Scrutiny

Establishing an enabling social and cultural environment is fundamental to achieving greater OGBV accountability, one in which people are duly informed of the role they play in fighting the culture of impunity surrounding OGBV. Leveraging the power of the people to act as a watchdog can be done through citizen and investigative journalism, research, campaigning and media coverage that are representative of women in all their diversity.

Influencing Business Models

As profit-driven entities, tech companies are influenced and incentivised by market dynamics. These can be leveraged for greater accountability through, for example, increasing peer competition between tech companies or incorporating a gender lens into due diligence or investment criteria.

Forums for Multi-Sector Collaboration

Strengthening existing and new links across sectors and establishing forums for collaboration can lead to increased understanding and trust. These networks help organizations to hold each other to account, and support the collective drive towards change.

Transparency and Data

The lack of data sharing and transparency from tech companies is a huge barrier to accountability and progress on OGBV. To monitor progress, common standards and principles for transparency and data sharing need to be established at national and regional levels. This includes a common data platform for OGBV with clear indicators and metrics for measurement.

Digital Literacy and Citizenship

Increasing accountability on OGBV also requires individuals to have a higher level of digital literacy and to behave as digital citizens, holding each other to account for their online behaviours and interactions. Governments and tech companies have a responsibility to prioritise investment in digital literacy and citizenship programmes, to support communities to participate meaningfully on their platforms without putting the onus of safety on them. Much of this work is currently happening in the non-profit sector without appropriate investment.
