# Processing with AI

## Exploration: AI & Ethics

Name:
> Nicolas Domenech

Subject:
> Detect dermatological problems using Computer Vision

>[TOC]

## Design brief

### Bias

If we don't source our dataset with enough rigor, the following biases might appear:

> 1. Picture quality could skew the analysis and lead to a wrong diagnosis. For example, moles that are degenerating are often recognized by inconsistencies in their colour, so the pictures must render colours very accurately.
> 2. The model could detect dermatological problems where none exist. For example, shadows caused by poor lighting could be mistaken for skin conditions. If the model is trained to distinguish shadows from actual skin features, it will be more reliable.
> 3. The model might only work on certain skin types. For example, it could be unable to detect dermatological problems on black skin.

We will ensure that our model is not biased by:

> 1. Sourcing our data from a database that is diversified in skin types: it has to contain all existing skin types.
> 2. Making sure our data take into account elements that are not skin, such as underwear or jewellery.
> 3. Checking robustness to differences in picture resolution: the model must be able to analyse pictures of different resolutions.

### Overfitting

> We will make sure our model does not overfit by testing its predictions against real-world outcomes, such as cases where a doctor actually ordered surgery. To me, this kind of tool should be used to help patients easily get a first idea of a potential problem. Nevertheless, a patient should only rely on a doctor's opinion. This tool is a good first step but should not be considered 100% reliable. The firm promoting this tool should be careful to warn patients that they may use it, but must have the diagnosis validated by a doctor.

### Misuse

> We have to remind ourselves that our application could be misused by racist people to perform skin classification.
> In some countries where a government wants to carry out genocide, it could identify minorities with this tool if it were equipped with skin-type recognition.

### Data leakage

> In a catastrophic scenario where our entire training dataset was stolen or recovered from our model, the risk would be that these pictures could be explicit. Since dermatological problems can occur on any part of the body, some pictures will be more or less explicit; for example, some could contain private parts of the body. The company responsible for this tool would be accountable for protecting these pictures.

### Hacking

> If someone found a way to "cheat" our model and make it output any prediction they want instead of the real one, the risk would be false diagnoses. People could believe they have a dermatological problem when they do not, and go to the doctor needlessly stressed. On the other hand, they could believe they have no problem when they actually need treatment, or even surgery.
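A minimal sketch of one possible mitigation for this "cheating" risk. Everything here is hypothetical and invented for illustration (the `predict` function, the `robust_predict` helper, the single `irregularity` feature, and the 0.5 threshold stand in for a real trained model): the idea is simply to re-run the model on slightly perturbed copies of the input and refuse to give a verdict when the prediction is not stable, since adversarial inputs often sit right at a decision boundary.

```python
import random

# Toy stand-in for a trained model: classifies a lesion from one
# hand-crafted feature (colour irregularity). Purely illustrative;
# a real system would use a trained image classifier.
def predict(irregularity: float) -> str:
    return "suspicious" if irregularity > 0.5 else "benign"

def robust_predict(irregularity: float, eps: float = 0.05,
                   trials: int = 50, seed: int = 0) -> str:
    """Cheap consistency check: if tiny perturbations of the input
    flip the model's answer, flag the case instead of answering."""
    rng = random.Random(seed)
    base = predict(irregularity)
    for _ in range(trials):
        perturbed = irregularity + rng.uniform(-eps, eps)
        if predict(perturbed) != base:
            # The prediction is fragile: defer to a human expert.
            return "unstable - refer to a doctor"
    return base

print(robust_predict(0.9))   # far from the decision boundary: stable
print(robust_predict(0.51))  # right at the boundary: gets flagged
```

The point of the sketch is the deferral behaviour: an input sitting exactly on the decision boundary (whether adversarial or just ambiguous) is sent to a doctor rather than answered, which matches the principle above that the tool should only ever be a first opinion.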