GPT-3, an AI program, can write essays, op-eds, tweets, and dad jokes. It will change how we think about creativity. Who is “we”? Doesn’t Alexi deal with this kind of stuff? Leave me alone so I can get back to my opiates.
There is an unresolved issue when AI becomes judge and jury in our society: prejudice. Not necessarily the headline-grabbing prejudices like racism and misogyny, but the prejudices we don’t know we have. For example, app programmers working for financial firms may write code that quietly favors the firm simply as a matter of business, rather than allowing a fair integration with societal mores.
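A purely hypothetical sketch of how that can happen, with every name, weight, and threshold invented for illustration: nothing in it is drawn from any real firm’s software, but it shows how bias can hide inside ordinary business logic rather than in any headline-grabbing slur.

```python
# Hypothetical loan-approval logic. All weights and thresholds are invented
# for illustration; the point is how easily "business as usual" becomes bias.

def loan_score(income: float, zip_code_risk: float, existing_customer: bool) -> float:
    """Toy credit score built from three inputs."""
    score = income / 10_000            # ability to pay
    score -= 4 * zip_code_risk         # neighborhood proxy: quietly penalizes where you live
    if existing_customer:
        score += 2                     # rewards people already profitable to the firm
    return score

def approve(income: float, zip_code_risk: float, existing_customer: bool) -> bool:
    # Threshold tuned to the firm's loss target, not to any fairness target.
    return loan_score(income, zip_code_risk, existing_customer) >= 5.0

# Two applicants with identical incomes get different answers solely because
# of neighborhood and prior relationship with the firm.
print(approve(60_000, zip_code_risk=0.2, existing_customer=True))   # True
print(approve(60_000, zip_code_risk=1.5, existing_customer=False))  # False
```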
Several studies already show that existing government programs arrive at different decisions based on assets, neighborhoods and cultural differences. To wit: road builders and the Interstate system have always chosen less expensive neighborhoods through which to route highways. Government policies also are prejudiced by NIMBY politics (Not In My Backyard). And finally, urban development regulations allow venture capitalists to buy up inexpensive land inhabited for many generations by unique subcultures.
How will AI make sensitive, on-the-edge decisions? Mariner spent enough years in the automated data world to know that more than enough data will be available; it’s the analog formulas where the rubber meets the road.
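To make that point concrete, here is a minimal sketch, with invented neighborhoods and numbers, of why the data is the easy part. Both runs below use exactly the same data; only the weights in the “analog formula” change, and with them the decision.

```python
# Minimal sketch: same data, different human-chosen weights, different outcome.
# All names and figures are invented for illustration.

neighborhoods = {
    # name: (land cost in $M, households displaced)
    "Riverside": (40, 1200),
    "Hillcrest": (90, 150),
}

def route_choice(cost_weight: float, displacement_weight: float) -> str:
    """Pick the highway route with the lowest weighted penalty."""
    def penalty(stats):
        cost, displaced = stats
        return cost_weight * cost + displacement_weight * displaced
    return min(neighborhoods, key=lambda name: penalty(neighborhoods[name]))

print(route_choice(cost_weight=1.0, displacement_weight=0.0))   # "Riverside": budget is all that counts
print(route_choice(cost_weight=1.0, displacement_weight=0.5))   # "Hillcrest": displaced families count too
```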
Today, cultural change is in the hands of the owners, the citizens. As everyone has learned, change is nasty and confusing, and the final outcome is unknown. Computerized data, no matter how hard it tries, cannot emulate values in a topsy-turvy world, unless humans surrender reality to the Matrix.
Ancient Mariner