Michal Kosinski, a professor at Stanford, has published research suggesting that AI can effectively guess your political views, IQ, personality traits, sexual orientation, and many other attributes. All it needs is a photo of your face.
Thanks to Machine Learning and Deep Learning, computers have become much better at recognizing images. In the near future, we will see extraordinary and disturbing applications of facial detection technology, which could have serious social consequences and raise ethical questions (how can one prevent the erosion of privacy in the face of such algorithms?).
“The face is an observable proxy for a wide range of factors,” he said.
As you can imagine, Kosinski’s research is highly controversial. What if some countries used these algorithms to persecute gay people? Kosinski explained that his research should be taken as a warning to strengthen privacy protections, since governments already have access to this kind of technology.
The algorithm can also tell your political views with pretty good accuracy just by seeing your face. This may indicate that political opinions are heritable. If political leanings are indeed linked to genetics, that could explain the detectable facial differences.
Moreover, it can estimate a person’s IQ from nothing more than a photo of their face. Just imagine what would happen if AI could reveal which children are genetically more intelligent.
The researchers predicted that with a large set of facial images, the algorithm could detect whether a person is a criminal or a psychopath. Obviously, such technology could be used to cause great harm: what if someone trains it on biased data? We should tightly regulate these kinds of algorithms, especially in criminal justice, where a machine could make wrong decisions about prison sentences or release. As I’ve said, if the algorithm is trained on biased data from a racially prejudiced court, it would cause tremendous harm.
Nightclubs and sports stadiums could also scan people’s faces before entry to detect threats of violence. However, that is not much different from today’s security guards, who make subjective decisions about people before letting them in.
Even though the algorithm is highly accurate, it is not 100% accurate. Consequently, there will always be a chance of incorrect predictions, which could lead to unfair conclusions about individuals.
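This problem is sharper than it first appears because of the base-rate effect: when a predicted trait is rare, even a highly accurate classifier flags far more innocent people than guilty ones. The numbers below are purely hypothetical, chosen only to illustrate the arithmetic:

```python
# Base-rate sketch with hypothetical numbers: a classifier that is
# right 90% of the time still produces mostly false positives when
# the trait it screens for is rare in the population.

def screening_outcomes(population, base_rate, sensitivity, specificity):
    """Return (true_positives, false_positives) for one screening run."""
    positives = population * base_rate            # people who have the trait
    negatives = population - positives            # people who do not
    true_positives = positives * sensitivity      # correctly flagged
    false_positives = negatives * (1 - specificity)  # wrongly flagged
    return true_positives, false_positives

# Hypothetical: screen 1,000,000 people for a trait held by 1% of them,
# using a classifier that is 90% accurate on both classes.
tp, fp = screening_outcomes(1_000_000, 0.01, 0.90, 0.90)
precision = tp / (tp + fp)
print(f"true positives: {tp:.0f}, false positives: {fp:.0f}")
print(f"share of flagged people who actually have the trait: {precision:.0%}")
```

Under these made-up numbers, roughly 9,000 true positives are swamped by roughly 99,000 false positives, so only about 8% of the people the system flags actually have the trait. That is the sense in which biased or merely imperfect algorithms can do real harm at scale.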
Thanks so much for reading; that’s the end of Article 96. Share it with your friends, and don’t forget to follow me on social media.