Women, data and power. A look inside the platform economy



Keywords:

Data infrastructure, platforms, feminism, women


References


Andreeva, G., & Matuszyk, A. (2018). Gender discrimination in algorithmic decision-making. 2nd International Conference on Advanced Research Methods and Analytics (CARMA 2018).

Cadwalladr, C., & Graham-Harrison, E. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian.

Crawford, K. (2013). The Hidden Biases in Big Data. Harvard Business Review.

Data2X. (2021). Important data about women and girls is incomplete or missing.

D'Ignazio, C., & Klein, L. F. (2019). Data feminism. MIT Press.

D'Ignazio, C., & Klein, L. F. (2020). Seven intersectional feminist principles for equitable and actionable COVID-19 data. Big Data & Society, 7(2), 1-6.

Eisenstat, Y. (2019). The Real Reason Tech Struggles with Algorithmic Bias. Wired.

Flexer, A., Doerfler, M., Schluter, J., & Grill, T. (2018). Technical Algorithmic Bias in a Music Recommender. 19th International Society for Music Information Retrieval Conference.

Forensic Architecture. (2020). The Killing of Zineb Redouane.

Gray, J., Bounegru, L., Milan, S., & Ciuccarelli, P. (2016). Ways of seeing data: Towards a critical literacy for data visualizations as research objects and research devices. In S. Kubitschko & A. Kaun (Eds.), Innovative Methods in Media and Communication Research (pp. 290-325). Palgrave Macmillan.

Gutiérrez, M. (2018). Data activism and social change. Palgrave Macmillan.

Gutiérrez, M. (2019). Participation in a datafied environment: Questions about data literacy. Comunicação e Sociedade, 36, 29-47.

Gutiérrez, M. (2021). Algorithmic Gender Bias and Audiovisual Data: A Research Agenda. International Journal of Communication, 15, 439-461.

Hajian, S., Bonchi, F., & Castillo, C. (2016). Algorithmic Bias: From Discrimination Discovery to Fairness-aware Data Mining. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.

Hao, K. (2019). This is how AI bias really happens - and why it's so hard to fix. MIT Technology Review.

Helmond, A. (2015). The Platformization of the Web: Making Web Data Platform Ready. Social Media + Society, 1(2), 1-11.

Knight, W. (2016). How to Fix Silicon Valley's Sexist Algorithms: Computers are inheriting gender bias implanted in language data sets - and not everyone thinks we should correct it. MIT Technology Review.

Kõuts-Klemm, R. (2019). Data literacy among journalists: A skills-assessment based approach. Central European Journal of Communication, 3, 299-315.

Langston, J. (2015). Who's a CEO? Google image results can shift gender biases. University of Washington.

Pegg, D., & Cadwalladr, C. (2018). US data firm admits employee approached Cambridge Analytica. The Guardian.

Ramsey, L. R., & Horan, A. L. (2018). Picture this: Women's self-sexualization in photos on social media. Personality and Individual Differences, 133(15), 85-90.

Reuters. (2018). Myanmar: UN blames Facebook for spreading hatred of Rohingya. The Guardian.

Rodríguez Martínez, M., & Gaubert, J. (2020). International Women's Day: How can algorithms be sexist? Euronews.

Taylor, L. (2018). As technology advances, women are left behind in digital divide. Thomson Reuters Foundation.

Tolan, S. (2019). Fair and Unbiased Algorithmic Decision Making: Current State and Future Challenges (Digital Economy Working Paper) [Background paper to the European Commission's report: 'Artificial Intelligence: A European Perspective']. European Commission - Joint Research Centre.

Vaitla, B., Bosco, C., Alegana, V., & Wouter, E. (2017). Big Data and the Well-Being of Women and Girls: Applications on the Social Scientific Frontier. Data2X.

van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197-208.

Wachter-Boettcher, S. (2017). Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. W. W. Norton & Company.

Wang, E. (2018). Two dangerous visions: What does it really mean for an algorithm to be biased? The Gradient.

Weizman, E. (2017). Forensic Architecture: Violence at The Threshold of Detectability. Zone Books.

Zhao, J., Wang, T., Yatskar, M., Ordonez, V., & Chang, K.-W. (2017). Men Also Like Shopping: Reducing Gender Bias Amplification using Corpus-level Constraints. In M. Palmer, R. Hwa & S. Riedel (Eds.), Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (pp. 2979-2989).


How to cite

Gutiérrez, M. (2023). Mujeres, datos y poder. Una mirada al interior de la economía de las plataformas. Feminismo/s, (42), 13–25.