Algorithm Bias - Virtual Workshop
Time: 12:00pm - 3:00pm EST
Date: January 20, 2021

Hosted by Women in Analytics (WIA)

An immersive workshop that enables participants to experience firsthand the unavoidable nature and ill effects of bias in algorithms.

About this Event:

WIA brings together industry leaders, technical experts, entrepreneurs, and academics to discuss the latest technologies, frameworks, strategies, and methodologies proven successful, while showcasing as speakers the women who have helped develop and execute them.

Our mission is to give visibility to the women making an impact in the analytics space and to provide a platform for women to lead the conversations around the advancement of analytical research, development, and applications.

This workshop is brought to you in partnership with Singularity University.

During this workshop you will:

  • Use data about you to design and train your own algorithms
  • See how your algorithms encode and perpetuate bias, despite your best efforts to avoid it
  • Directly feel what it is like when you or a peer is excluded by your algorithms
  • Clearly understand how to identify and analyze the mechanisms that lead to bias in algorithms
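The workshop's own exercises are not public, but the core mechanism it explores — a model faithfully reproducing the skew in its training data — can be illustrated with a deliberately simple sketch. Everything here (the groups, the records, the majority-vote "model") is invented for illustration:

```python
from collections import Counter

# Hypothetical training data: past decisions as (group, outcome) pairs.
# The historical labels themselves are skewed against group "B".
history = ([("A", 1)] * 80 + [("A", 0)] * 20 +
           [("B", 1)] * 20 + [("B", 0)] * 80)

def train(records):
    """Learn the majority outcome per group - a deliberately naive 'model'."""
    votes = {}
    for group, label in records:
        votes.setdefault(group, Counter())[label] += 1
    return {g: c.most_common(1)[0][0] for g, c in votes.items()}

model = train(history)
print(model)  # -> {'A': 1, 'B': 0}: the model replays the historical skew
```

No feature of group "B" causes the rejection; the model is simply an accurate summary of biased historical outcomes, which is one of the mechanisms the workshop asks participants to identify and analyze.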

Participating in this workshop will enable you to:

  • Advocate for more socially responsible algorithms
  • Build more socially responsible algorithms
  • Determine how to manage algorithmic bias systemically at your organization

This will be a highly interactive experience and a more actionable follow-up to our webinar series on mitigating bias in analytics. Following a hands-on education principle, it gives you an opportunity to take the next step toward creating more socially responsible algorithms.

You do not need any formal training to participate in this workshop, and there is no requirement to have attended the previous WIA webinars. We encourage you to ask your employer to support your participation in this workshop!

Cost: $300

To register, click here
