According to the National Center for Women & Information Technology (NCWIT), 25% of the computing workforce was female in 2015.
Hiring Across Technical Roles: It's an imbalance that is reflected in company diversity reports. Microsoft, for example, reports 29% female workers across its staff, but in technical positions only 17% are women. Of Google's senior management and executive officer team, 17 members are men while only three are women. Men make up 83% of Google's engineering staff, and Apple's technical team is 80% male.
Global Level:
Personal Level:
Model Training
Integrating with the Slack Platform
Language Modelling: Bengio's Work in 2003
The model consists of a one-hidden-layer feed-forward neural network that predicts the next word in a sequence.
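As a rough illustration of that idea, here is a minimal Keras sketch of a Bengio-style feed-forward language model: embed the previous context words, concatenate the embeddings, pass them through a single hidden layer, and output a softmax over the vocabulary. All layer sizes below are my own illustrative assumptions, not values from the paper.

```python
# Sketch of a Bengio-style feed-forward neural language model.
# All sizes are illustrative assumptions, not values from the 2003 paper.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Flatten, Dense

vocab_size = 10000   # assumed vocabulary size
context_size = 4     # predict the next word from the previous 4 words
embedding_dim = 64   # assumed embedding dimensionality

model = Sequential([
    # Look up an embedding for each of the context_size previous words
    Embedding(vocab_size, embedding_dim, input_length=context_size),
    # Concatenate the context embeddings into a single feature vector
    Flatten(),
    # The single hidden layer of the model
    Dense(128, activation="tanh"),
    # Probability distribution over the next word in the sequence
    Dense(vocab_size, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```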
Word2Vec: In 2013, Mikolov et al. proposed two architectures for learning word embeddings that are computationally less expensive than previous models; a short gensim sketch follows the two variants below.
CBOW: Continuous Bag of Words uses the surrounding words to predict the centre word.
Skip-gram: uses the centre word to predict the surrounding words.
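For illustration, both variants can be trained with gensim (assuming gensim 4.x); the toy sentences and hyperparameters below are placeholders, and the `sg` flag switches between CBOW and Skip-gram.

```python
# Toy sketch of training Word2Vec with gensim 4.x; the sentences and
# hyperparameters are placeholders, not the project's actual training data.
from gensim.models import Word2Vec

sentences = [
    ["girls", "are", "like", "that"],
    ["great", "job", "on", "the", "release"],
]

# sg=0 -> CBOW (context words predict the centre word)
# sg=1 -> Skip-gram (centre word predicts the context words)
cbow = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=0)
skipgram = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)

print(cbow.wv["girls"][:5])  # first few dimensions of the learned vector
```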
Model architecture (figure): Input → Word2Vec embeddings → LSTM → Fully connected network (FCN) → Output
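One way to realise this pipeline is sketched below in Keras: an embedding layer initialised from pretrained Word2Vec vectors, an LSTM encoder, and a fully connected classifier head. Every layer size and the random placeholder embedding matrix are assumptions for illustration only, not the project's actual configuration.

```python
# Sketch of the Input -> Word2Vec -> LSTM -> FCN -> Output pipeline in Keras.
# Sizes and the random embedding matrix are placeholders for illustration.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

vocab_size = 20000    # assumed vocabulary size
embedding_dim = 300   # common Word2Vec dimensionality
max_len = 100         # assumed maximum message length in tokens

# In practice this matrix is filled with pretrained Word2Vec vectors,
# one row per vocabulary word; random values here are only a stand-in.
embedding_matrix = np.random.normal(size=(vocab_size, embedding_dim)).astype("float32")

model = Sequential([
    Embedding(vocab_size, embedding_dim,
              weights=[embedding_matrix],
              input_length=max_len,
              trainable=False),            # keep the Word2Vec vectors fixed
    LSTM(128),                             # encode the message sequence
    Dense(64, activation="relu"),          # fully connected layer
    Dense(1, activation="sigmoid"),        # probability the remark is sexist
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```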
Beginner: Use the toxic-comment code on GitHub and build it for Reddit channels (see the sketch after this list).
Create an analysis of your reports: what should people learn from it?
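For the Reddit idea above, a hypothetical starting point is to pull comments with PRAW and feed them to the existing classifier; the credentials, subreddit name, and the `model`/`preprocess` helpers are all placeholders, not part of this project yet.

```python
# Hypothetical sketch: collect Reddit comments with PRAW and score them
# with the toxic-comment classifier. Credentials, the subreddit, and the
# model/preprocess helpers are placeholders.
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="sexist-remark-detector/0.1",
)

# Grab the body text of recent comments from an example subreddit
comments = [c.body for c in reddit.subreddit("cscareerquestions").comments(limit=200)]

# scores = model.predict(preprocess(comments))   # assumed helpers from the toxic-comment code
# flagged = [c for c, s in zip(comments, scores) if s > 0.5]
```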
Please feel free to contribute any data or remarks you find suitable for this project. I believe that sexism in workplace conversations is a global problem, and I hope that this bot will be a step towards eliminating it. To get there, I need crowdsourced data to improve the model and make it more usable.
Crowdsourcing data
Unfortunately, there are no proper datasets of casual sexist remarks made at workplaces. The datasets I have come across are extremely vulgar/obscene, whereas sexist remarks at work are often subtle or hide behind seemingly ordinary phrases like "Girls are like that".
Adam Shamsudeen - for helping me research data and for finding the DeepPavlov data.