Since this is a talk on gender, it will include bits and pieces of sexual harassment, violence and abuse. So please give yourself the space to think deeply about some of these issues.
All the data that we're presenting here has been rigorously redacted by a privacy and anonymization team, and we are only using pseudonyms and public images. And, like all previous speakers, I'm only representing the very tip of the iceberg here. This is nearly eight months of research.
India is a country with a 48 percent female population, nearly half the population. But only 29 percent of its Internet users are female. That means that for every female data point, there are about 2.4 male data points (71 divided by 29). So when we look at data as an aggregate, we are mostly looking at male usage behavior. And the aggregate doesn't represent people at the margins. If we don't pay attention to subgroups, we continue to marginalize them.
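To make that skew concrete, here is a minimal sketch with made-up numbers (the usage figures are ours, purely illustrative, not study data) of how an aggregate metric can hide a subgroup: with a 71/29 split, the overall average sits close to the male average.

```python
# Minimal sketch with illustrative, made-up numbers: when 71% of data
# points come from men, the "overall" metric is mostly a male metric.
from statistics import mean

# Hypothetical daily minutes of app usage per user (not real study data).
male_usage = [42] * 71     # 71 male data points
female_usage = [15] * 29   # 29 female data points

overall = mean(male_usage + female_usage)
print(f"male avg:    {mean(male_usage):.1f} min/day")    # 42.0
print(f"female avg:  {mean(female_usage):.1f} min/day")  # 15.0
print(f"overall avg: {overall:.1f} min/day")             # 34.2 -- close to male

# The aggregate (34.2) sits far from the female average (15.0):
# optimizing for it means optimizing mostly for male usage behavior.
```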
So our call to action to you is to listen. Listen to these subgroups, the people at the margins. And we've been working on this project to make that easier, to give you a bit of a head start.
We spoke to people who identified as female in six countries: India, Pakistan, Bangladesh and Nigeria, with research currently under way in Brazil and Mexico. We spoke to over 200 participants. We used a mixed-methods approach: qualitative research through interviews, observations and focus groups, and quantitative research through log analysis and non-scaled surveys. We partnered with local universities and the names that you saw earlier. And we sampled across income levels, sexual orientations, religions and geographic locations, working with recruitment agencies but also with non-profits.
We wanted to understand the broader ecosystem of women's access to technology, so we also included men in our research, in order to understand how they view women's access to phones and the Internet. This includes brothers, husbands, partners and fathers.
We packaged all of our research into this one slide: Access, Content and Community, Privacy, and Safety. Let's start with Access.
We visited a village in [inaudible], a state in the north of India with a population higher than that of Nigeria. When we spoke to the men in the village, they talked about how they viewed women's access to technology, often associating it with eloping and bad things like promiscuity that they felt arise as a result of access. So even when women did have access, their husbands and fathers would regularly check their phones, just to make sure that they were safe.
We should note that many men were genuinely trying to protect their vulnerable family members, but did so in a way that reinforced patriarchy. As this gentleman notes, it's no problem if women get phones, as long as they're not doing anything bad: "Mobiles can spoil women." So you regularly hear about news incidents of phones being banned only for women. And here's a female perspective on the same issue.
We spoke to Chithra, who learned how to do makeup and hairstyles from watching online videos and set up a mini beauty parlor micro-business in the village, where the village women come to her for birthdays and marriages. She said: "When women talk about getting access to see a recipe, or a [inaudible], or a henna design, something which is very feminine and useful in the eyes of her man, then they let us use the phone." In other words, she is working around her access constraints by playing to the [inaudible] of feminine and productive content, as viewed by her society. But of course, she probably isn't just looking at henna designs and recipes. She's doing a whole lot more.
Even when women do get access to technology, there are other factors that influence their usage. Time is something that we assume people have when they use our products. But the onus of housework and childcare often falls on women, even among working women. In a survey of over 30 countries, women in India spent five hours per day doing housework, including childcare, washing and cooking. Men spent 20 minutes. So in that survey, women in India did the most housework of the 30 countries, and men did the least. Are our designs perceived as female-friendly, not just in the eyes of women but also, given what we heard from the men in our research, in the eyes of men? Do our designs provide a meaningful experience in a short amount of time? A number of [inaudible] told us how they really have only about 15 minutes of free time per day. Can we help them meaningfully engage with the signal and separate out the noise?
So, assuming they can get to the experience, what lies ahead? Is the content relevant to them? A young mother in Bangladesh talks about her pregnancy: "I had so much anxiety. What will the changes in my body be? I could not ask these questions to anyone." And she had a mother and a mother-in-law. Women often have a need for information on sensitive topics. This includes the health domain, but also other domains like finance and well-being, which are highly stigmatized and considered a woman's knowledge, something that she should just have, and hence taboo to discuss. Information platforms can play a crucial role here in opening up a conversation about these stigmatized topics.
But people need community as much as they need information. They're both fundamental needs, and this is especially true for women who are going against the grain, breaking norms, or coming from smaller towns. [inaudible], a student in India, talks about relatable role models in her life. She says: "When I search online, I can't show my mom [inaudible] and say: I want to be like her. My mom needs to see someone who looks like me, talks my language, but is from my community." But really, it's up to those women in computer science. This is further to Ricken's point about how people like to see others from their own community when getting inspired. And this relatable role model is playing a role not just for herself now, but also for her family members, in helping her break the norm of young women being overprotected.
So, can people easily discover taboo topics on your platforms, for use cases like health, education and finances? And do we offer the ability to not only discover community and relatable role models, but actually connect with them?
So, for the women using our technologies, do we understand their privacy needs, and do we provide the right controls? When we talk about privacy, we want you to open up your minds to a slightly different version of what privacy is. In the countries where we've done research, privacy, for the most part, is less about data collection by organizations and governments. That concern does exist for a certain segment, but predominantly, for women and non-binary people, it is about what other people can see, and the potential reputational and physical damage that can arise as a result. And this privacy is important for women in both the physical and online worlds. So we want to illustrate both.
Here's a vignette from the physical world. Women tend to share their phones or have mediated access, simply because of lower access to technology. Isha, a garment worker in Dhaka who recently purchased a smartphone, told us how her husband regularly checks her phone and her sister-in-law's kids borrow it. We found, consistently, that the mother's phone is the shared device at home. And because Isha cannot refuse to give the phone to someone, she has other strategies to maintain her privacy. She has an app lock, so the kids can borrow the phone, but her social media messages and her photos stay hidden behind a password.
Women come under great social scrutiny and shaming for their actions, so it's even more important that our privacy controls work for them. Even when an app lock is enabled by a woman, the very presence of the app lock application on her phone leads to questions like: What do you have to hide from me? So we have an opportunity to address this. Do our technologies allow people to easily remove traces? Instead of heavy-handed controls like locks and profiles, which don't necessarily work in this fluid environment of social sharing, can we provide micro controls that offer flexibility and freedom to people as they move between different situations? Micro privacy states, like incognito, are always useful, but we discovered in our research that they're limited to only the tech-savvy. So how can we help everyone discover them?
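To make the idea of a micro control concrete, here is a minimal sketch of our own (hypothetical, not a feature from the study): each item carries its own visibility flag, and a discreet guest mode simply filters hidden items, rather than a visible lock announcing that something is being hidden.

```python
# Hypothetical sketch of a per-item "micro" privacy control: a guest
# mode that silently filters hidden items, instead of a visible lock
# that invites the question "what do you have to hide?"
from dataclasses import dataclass

@dataclass
class Photo:
    filename: str
    hidden: bool = False  # set per item by the owner

def visible_photos(photos: list[Photo], guest_mode: bool) -> list[Photo]:
    """In guest mode, hidden items are omitted with no placeholder or
    lock icon, so the gallery looks complete to the borrower."""
    if guest_mode:
        return [p for p in photos if not p.hidden]
    return photos

gallery = [Photo("henna_design.jpg"), Photo("private.jpg", hidden=True)]
print([p.filename for p in visible_photos(gallery, guest_mode=True)])
# ['henna_design.jpg'] -- nothing signals that an item was hidden
```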
Moving on to online information privacy. This is a beautiful quote from a lady in Dhaka: "Information is uncontrollable. Once you share anything, you cannot take it back." So it should be no surprise that all of our participants carefully guard their personally identifying information online. Women may hesitate to engage online, whether that means posting comments, creating content, or creating a full-fledged profile, because of the stalking, hate and social judgment that come with it.
Just to reinforce this further, consider a recent news story that broke a few months ago, about how mobile phone numbers were copied by shopkeepers in the state that we talked about earlier and sold as bulk packages. People could buy these packages to contact women and stalk them. So the connection that many women made in our studies was: if you have my phone number, then you'll get access to my contents on the phone, and you can cause some form of reputational damage to my identity.
Women hesitate to disclose phone numbers, full names, their gender, or their location, which are all vectors of abuse. And yet more and more technologies are relying on phone numbers and personally identifying information, which is problematic for women and non-[inaudible] communities. There are useful things that we can do here as technology researchers and designers. Start by asking: do we really need that person's identifying information, even if it is optional? Do we offer safe alternatives to profile pictures and names, such as avatars, pseudonyms or nicknames? And do we allow users to easily find and manage their privacy settings, instead of them having to hack our designs?
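Here is a minimal sketch of what those safe alternatives might look like (the field names and pseudonym scheme are our own illustration, not from the talk): an account where the phone number and real name are genuinely optional, and the public display name defaults to a generated pseudonym.

```python
# Hypothetical sketch: a sign-up record where PII is optional and the
# public-facing name defaults to a generated pseudonym.
import secrets
from dataclasses import dataclass, field
from typing import Optional

ADJECTIVES = ["quiet", "bright", "swift", "gentle"]
BIRDS = ["heron", "sparrow", "ibis", "kite"]

def make_pseudonym() -> str:
    # e.g. "bright-ibis-4821"; reveals nothing about the person
    return (f"{secrets.choice(ADJECTIVES)}-"
            f"{secrets.choice(BIRDS)}-{secrets.randbelow(10000)}")

@dataclass
class Account:
    display_name: str = field(default_factory=make_pseudonym)
    phone: Optional[str] = None      # never required to participate
    real_name: Optional[str] = None  # never shown publicly
    gender: Optional[str] = None     # self-declared, optional

user = Account()
print(user.display_name)  # e.g. 'quiet-heron-317'
```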
So, lastly: safety. Safety is often viewed as an edge case, yet harassment and abuse are common incidents. When we spoke to a broad spectrum of women, safety came up in nearly every interview. Safety here isn't just about online abuse and social shaming; it's an ever-present need in the real world. One of our participants told us how, when she printed bus tickets, a women's helpline number was printed along with them, and the system automatically assigned women to sit next to each other. So even though she was probably never going to call that helpline, its very presence gained her trust.
One-touch panic buttons weren't used by any of the participants in our research. But technology is used today to feel safe in various physical situations. Directions on maps are used to double-check routes when traveling on public transport. The camera is used to take pictures of a vehicle's number plate, for accountability. And fake phone calls are often made to feel safe in a situation, just to signal that someone else is out there and can look out for me. But many women don't know about these features or when to use them. How can we ensure that safety features lead to a meaningful follow-up? Instead of a "Thank you for your feedback," can we offer something that will actually enable our users to come back to our technologies?
Inclusive research is just good research. Start by balancing your gender ratios. Create safe spaces to talk to minorities, to understand their points of view. Analyze by aggregates, but also analyze by subgroups, in order to understand how they fit into the larger story, or do not. And growth is a good thing to measure in any initiative or trend, to make your product more inclusive. Some of the simple metrics that you can use are: What percentage of your current users are female? What percentage of your market is female? And what's the difference in engagement between the various genders?
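As a minimal sketch of those three checks (assuming a hypothetical user table with self-reported gender and a per-user session count; the data below is made up):

```python
# Hypothetical sketch of the three gender-inclusion metrics, assuming
# users self-report gender and we log sessions per user (made-up data).
users = [
    {"gender": "female", "sessions_7d": 3},
    {"gender": "male", "sessions_7d": 9},
    {"gender": "male", "sessions_7d": 7},
    {"gender": "nonbinary", "sessions_7d": 5},
]
market_female_share = 0.48  # e.g. female share of the addressable market

female = [u for u in users if u["gender"] == "female"]
pct_users_female = len(female) / len(users)

def avg_sessions(group):
    return sum(u["sessions_7d"] for u in group) / len(group)

by_gender = {g: avg_sessions([u for u in users if u["gender"] == g])
             for g in {u["gender"] for u in users}}

print(f"female share of users: {pct_users_female:.0%}")  # vs 48% market
print(f"gap vs market:         {market_female_share - pct_users_female:+.0%}")
print("avg sessions by gender:", by_gender)
```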
So, again, our call to action to you is to listen. Listen to the subgroups, listen to the people at the margins. And write to us if you'd like us to bring this work to your organization. We'd be more than happy to do so.