# Processing with AI

## Exploration : πŸ‘©β€βš–οΈ Ethics of AI

Name:
> Bliukhterova Anastasiia

Subject:
> Improve dating app matching algorithms using NLP

[TOC]

## Design brief

### Bias

If we don't source our dataset with enough rigor, the following biases might appear:
>1. Racial bias (as seen, for example, in a healthcare risk algorithm)
>2. Cognitive biases
>3. Gender biases

We will ensure that our model is not biased by:
>1. Training it on data collected at different times (both daytime and nighttime usage, for example)
>2. Covering all the cases we expect our model to be exposed to (this can be done by examining the domain of each feature and making sure we have balanced, evenly-distributed data covering all of it; see the sketch at the end of this brief)
>3. Asking a colleague to look into the feature(s) we are considering discarding; a fresh pair of eyes will definitely help
>4. Exposing the algorithm to a more even-handed distribution of examples

### Overfitting

We will make sure our model does not overfit by:
> Removing all unnecessary features from our app (like video calls or geolocation, for example). We can do this by telling a story about how each feature fits into the dating app.

### Misuse

>We have to remind ourselves that our application could be misused through **inappropriate propositions** intended to **denounce or exploit** someone.
>So, access to the dating app should be carefully verified (age, photo or video verification, etc.). This will help us reduce different types of abuse, such as propaganda, sexual abuse, and terrorist influence.

### Data leakage

>In a catastrophic scenario where our entire training dataset was stolen or recovered from our model, the risk would be the leaking of all users' personal data (general info and even private info like messages and photos). This would leave the app with a damaged brand image afterwards.

### Hacking

> If someone found a way to "cheat" our model and make it produce any prediction they want instead of the real one, the risk would be that this person could obtain all the private info they want about the person they would like to hack.
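
As a concrete illustration of the balance check mentioned in point 2 of the bias-mitigation list above, here is a minimal Python sketch. It only reports how each category of a feature is represented in the training data and flags features that look imbalanced; the file name `profiles.csv` and the columns `age_group`, `gender`, and `language` are assumptions made for this illustration, not part of the actual app.

```python
# Minimal sketch of a data-balance check before training.
# The dataset path and column names below are illustrative assumptions.
import pandas as pd


def report_feature_balance(df: pd.DataFrame, features: list[str], tolerance: float = 0.5) -> None:
    """Print the share of each category per feature and flag imbalanced features."""
    for feature in features:
        shares = df[feature].value_counts(normalize=True)
        print(f"\n{feature}:")
        print(shares.round(3).to_string())
        # Flag the feature if its rarest category holds less than `tolerance`
        # times the share of its most common category.
        if shares.min() < tolerance * shares.max():
            print(f"  -> WARNING: '{feature}' looks imbalanced; consider collecting "
                  "more examples or re-sampling before training.")


if __name__ == "__main__":
    profiles = pd.read_csv("profiles.csv")  # hypothetical export of user profiles
    report_feature_balance(profiles, ["age_group", "gender", "language"])
```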