Why AI regulation will resemble privacy regulation - Economygalaxy

You are a walking data repository. Whenever you leave your home or car to walk down a street, shop in a store, or attend any public event or meeting, you potentially surrender your personal privacy and cross the boundary from private individual to virtual public figure.

You can be filmed or photographed, your image can be transferred to a storage silo anywhere in the world, your voice can be recorded, and your time in public view can be logged. This is the world we live in as of 2022.

When you go online to make a purchase, a whole new door opens onto your personally identifiable information (PII). You will invariably be volunteering your name, address, phone number, and email address to strangers, and possibly more extensive information about yourself.

Ostensibly, this data remains private between you and the vendor. “Ostensibly” is the key word here, however: one never really knows how much of your PII stays legitimately private.

Everything cited above can become data and go on your record somewhere in the world, whether you like it or not. Over-the-top severe assessment? Possibly, but it’s up to you to know this and act accordingly. 

What information qualifies as personally identifiable information?

According to the U.S. Department of Labor (DOL), companies may maintain PII on their employees, customers, clients, students, patients, or other individuals, depending on the industry.

PII is defined as information that directly identifies an individual (e.g., name, address, Social Security number or other identifying number or code, telephone number, email address, etc.).

It can also mean information by which an agency intends to identify specific individuals in combination with other data elements, such as gender, race, birthdate, geographic indicator, and other descriptors.
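To make that two-part definition concrete, the check below sketches it in code. The field names, the direct/quasi-identifier lists, and the threshold are illustrative assumptions, not part of the DOL's definition or any regulation:

```python
# Illustrative sketch: a record can be PII either because it holds a direct
# identifier, or because it combines enough quasi-identifiers (gender, race,
# birthdate, geography, ...) to single out an individual.
# The field names and the threshold below are hypothetical.

DIRECT_IDENTIFIERS = {"name", "address", "ssn", "phone", "email"}
QUASI_IDENTIFIERS = {"gender", "race", "birthdate", "zip_code"}

def is_pii(record: dict, quasi_threshold: int = 3) -> bool:
    """Flag a record as PII if it contains any direct identifier,
    or at least `quasi_threshold` quasi-identifiers in combination."""
    present = {key for key, value in record.items() if value is not None}
    if present & DIRECT_IDENTIFIERS:
        return True
    return len(present & QUASI_IDENTIFIERS) >= quasi_threshold

# A record with no name or email can still qualify as PII
# through a combination of quasi-identifiers:
print(is_pii({"gender": "F", "race": "Asian",
              "birthdate": "1990-01-01", "zip_code": "10001"}))  # True
```

The point of the sketch is that no single field in the second record is identifying on its own; it is the combination that crosses the line, which is exactly why regulators treat these descriptors as sensitive.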

Privacy, AI regulations moving in parallel fashion

Data privacy laws and regulations governing data gathered for artificial intelligence are progressing along parallel paths through government agencies because the two are so intertwined.

Anytime a human is involved in an analytics project, bias can be introduced. In fact, AI systems that produce biased results have been making headlines.

One highly publicized example is Apple’s credit card algorithm, which has been accused of discriminating against women and prompted an investigation by New York’s Department of Financial Services.

Another is the COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) algorithm, used in U.S. court systems to predict the likelihood that a defendant will reoffend. Its predictions have proved wrong numerous times.

Amid all this PII collection, the rapid rise of analytics and machine learning in online applications, and the constant threat of bias in AI algorithms, law enforcement agencies are chasing down a growing number of complaints from citizens about online fraud.

Governments, too, are trying to get their arms around appropriate legislation in statewide efforts to curb this criminal activity. (Source: VentureBeat)

 
