Next: Scientific information policy Up: Information Policy Previous: Ex:US vs EU privacy

Protection from abuse

 

With the growth of databases, the potential for the abuse of information grows correspondingly. Markets for information tend to be self-regulating, in the sense that the database manager has no incentive to collect information not pertinent to his clients' decisions; nevertheless, potential problems need to be considered. The first is creating incentives for accuracy in databases. The second is how much privacy, in the sense of control over the release of information, individuals should have, given the vast amount of personal data that can be stored in databases. The third is balancing the needs of secrecy, such as privacy, trade secrets, or national security, against the benefits of better operational decisions.

Accuracy of information in databases regulated purely by market incentives depends on the value of accuracy to the clients and the cost to the database firm of obtaining it. The interests in accuracy of the database manager, the clients, and the subjects, however, are not identical. The database manager will use rules of thumb to equate the cost of obtaining accurate data to its value to the customers. If the accuracy of the database is sufficient to yield better decisions than alternative approaches, then the database has value to the client.

Consider, for example, a database for making credit-related decisions. If a credit database is 99.9% accurate but tends to err by erroneously denying credit, such a database would still be useful for credit allocation, since to credit issuers the loss from a bad loan is greater than the loss from forgoing a small number of potential customers. Given an efficient estimator of credit and a large number of subjects, the market would be almost efficient, except that the interests of the 0.1% would be damaged in that they would be denied credit. A low-cost procedure to protect the interests of that 0.1% is to grant subjects information rights to correct their files and to establish procedures for resolving disputes.
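The issuer's calculation above can be made explicit in a short sketch. All numbers here (error rate, loss figures) are illustrative assumptions, not values given in the text; the point is only that a database biased toward denial converts the costly error (a bad loan) into the cheap one (a forgone customer).

```python
def expected_loss_per_applicant(error_rate, loss_bad_loan, loss_lost_customer):
    """Expected cost per applicant of a database that errs toward denial.

    error_rate: fraction of applicants erroneously denied credit
    loss_bad_loan: issuer's loss when a loan defaults (unused here, shown
                   for contrast: denial errors never incur this loss)
    loss_lost_customer: profit forgone by turning away a good customer
    """
    # Errors toward denial cost only the forgone customer's profit.
    return error_rate * loss_lost_customer

# Hypothetical figures: 0.1% erroneous denials, $5000 loss on a default,
# $200 profit per good customer.
cost_of_denials = expected_loss_per_applicant(0.001, 5000, 200)
cost_if_errors_were_bad_loans = 0.001 * 5000
print(cost_of_denials)                # 0.2
print(cost_if_errors_were_bad_loans)  # 5.0
```

Under these assumed numbers an erroneous denial costs the issuer 25 times less than an erroneous approval, which is why the bias toward denial persists absent the subjects' correction rights.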

This approach, in fact, has already been taken with credit files, but the question remains to what extent the database manager should be liable in a tort action. This question, of course, is difficult to resolve. With no liability the database manager has no interest in increasing the accuracy of subjects' data beyond the accuracy desired by his clients. In fact, because subjects have strong incentives to correct negative information, no liability creates an incentive for the database manager to transfer the problem of accuracy to the subjects. Making the database firm liable for the expected damage of inaccurate data raises the cost of collecting data and consequently the cost of obtaining credit.

The right to learn itself implies no limit to the information database managers could collect about their subjects. As has been pointed out, the natural limit is determined by a cost-versus-value decision. With information rights implying disclosure of negative information, the amount of information to be gathered could become very large. For inanimate objects the prospect of very large amounts of information in databases is not personally threatening; for humans, however, the collection of this data can lead to mental anguish if negative information is widely disseminated. This aspect of privacy has common law protection in tort; that protection, however, is directed at the inappropriate dissemination of information rather than at its collection and use in decision making. The important issue which needs to be addressed is to what extent subjects of databases should have a right of privacy in the sense of prohibiting the collection and use of personal data.

The proposed compromise between privacy, in the sense of restricting the collection of personal data, and economic efficiency is that individuals should be protected from the collection and use of extraneous information in databases. The development of implementable criteria to limit the collection of personal data must be based on how decisions are made using databases. Such decisions generally involve an analytic screen using an analytic criterion, followed by a second, more intensive intuitive evaluation. In making the analytic screen the decision maker must incorporate into his criterion such social concerns as nondiscrimination based on race, sex, or creed. For analytic decision making, the information rights in operational decisions stop at inputs which current knowledge has demonstrated to be relevant. For this purpose relevance is not subjective opinion; rather, relevance is demonstrated by a professionally recognized study.

As knowledge advances, moreover, the requirements for demonstrating relevance will advance. In the future, then, the minimum requirement might be a statistical study based on a nonexperimental design published in a journal subject to professional peer review. This implies that while relevance might not always imply causality, it at least implies correlation. Data which does not meet this standard is extraneous and should not be collected in databases for operational decision making.
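The analytic screen described above can be sketched as a scoring function that refuses any input not on a list of demonstrated-relevant fields. The field names, weights, and cutoff below are hypothetical; the point is that extraneous data in a record simply cannot enter the criterion.

```python
# Inputs whose relevance a recognized study has demonstrated (hypothetical).
DEMONSTRATED_RELEVANT = {"years_experience", "certification_score"}

def analytic_screen(candidates, weights, cutoff):
    """Rank candidates using only demonstrated-relevant inputs.

    Raises AssertionError if the criterion references an extraneous field.
    """
    assert set(weights) <= DEMONSTRATED_RELEVANT, "extraneous input in criterion"
    scored = []
    for c in candidates:
        # Extraneous fields present in the record (e.g. hobbies) are ignored.
        score = sum(w * c[field] for field, w in weights.items())
        scored.append((c["name"], score))
    scored.sort(key=lambda t: -t[1])
    return [name for name, s in scored if s >= cutoff]

candidates = [
    {"name": "A", "years_experience": 10, "certification_score": 80, "hobby": "chess"},
    {"name": "B", "years_experience": 2, "certification_score": 95, "hobby": "golf"},
]
weights = {"years_experience": 3.0, "certification_score": 1.0}
print(analytic_screen(candidates, weights, 100))  # ['A', 'B']
```

A criterion mentioning `hobby` would be rejected outright, which is how the extraneous-data rule becomes self-enforcing at the screening stage.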

The creation of databases to assist in the search for new executives or other types of employees illustrates the conflict between obtaining information to create efficient markets, human desires for privacy, and firms' concerns about trade secrets. Efficient job-market databases for analytic employee search must contain measures of past performance that can be used to predict future performance. This criterion would require that performance information from firms be released to databases. In addition, databases would need industry standards for making performance measures, or at least for comparing performance measures across firms. Firms would object to the release of such data on the grounds that other firms would hire away the firm's best employees.

From the perspective of market economics, an efficient market serves social justice in that each worker receives his highest compensation. In a job market where employment typically lasts for short periods, the information requirements for creating efficient markets should override any misguided notion of protecting corporations from competition in labor markets. Each firm would occupy the same competitive position, and compensation would better reflect merit.

But with the emphasis on sufficient information to promote efficient markets, the participants would fear a loss of privacy. In response to this fear, the database can contain indicators of performance rather than the raw data. To make this point explicit, consider the health of a candidate for a job of finite duration. A pertinent input in evaluating alternative candidates is the health forecast over the time span of the contract. To resolve the conflict between the need to keep personal data private and market efficiency, assume that entrepreneurs have developed several competing programs that read the pertinent data from the subject's medical record and forecast health. The potential employer is entitled to the forecast but not to the raw data. To protect individuals' medical records, the holder of the medical record would run the evaluation programs as a service and would release only the output.
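The record holder's service can be sketched as follows. The forecasting rule and the record fields are invented for illustration; the protocol being shown is only that the raw record never leaves the holder, while any approved forecasting program's output does.

```python
def fitness_forecast(record, contract_years):
    # Hypothetical forecaster: probability of remaining fit for the
    # contract's duration, degraded by chronic conditions on record.
    base = 0.99 if record["chronic_conditions"] == 0 else 0.90
    return round(base ** contract_years, 3)

class RecordHolder:
    """Runs evaluation programs over a medical record as a service."""

    def __init__(self, record):
        self._record = record  # raw data is held privately

    def run_evaluation(self, forecaster, contract_years):
        # Only the program's output is released, never the record itself.
        return forecaster(self._record, contract_years)

holder = RecordHolder({"chronic_conditions": 0, "notes": "private details"})
print(holder.run_evaluation(fitness_forecast, 3))  # 0.97
```

The employer learns a single number relevant to the hiring decision; the notes and diagnoses behind it stay with the holder.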

In order to compete for employment, most individuals would release the summary statistics to the job market database managers. The right of privacy beyond the common law protection directed at the media would be the right not to be compelled to release information for database analysis when no recognized study had demonstrated the pertinence of that data. Some information such as race, creed, or sex might continue to be restricted by law.

Ensuring the individual's privacy from the collection of extraneous data of possible subjective interest benefits the individual only if that right is enforceable. Simply passing laws forbidding database managers to collect extraneous data would be difficult to enforce, since Fourth Amendment protection limits searches for extraneous data to cases of probable cause. One scheme which would make the collection of extraneous information much more expensive, and would help to ensure individuals released only required information in credential checks, is the cryptographic invention known as a blind signature.
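Chaum's RSA blind signature, the invention referred to above, can be shown in a few lines. The key below is a textbook-sized toy (real use requires large keys and proper padding); the property illustrated is that the signer certifies a credential without ever seeing its content.

```python
# Toy RSA key: n = 61 * 53, public exponent e, private exponent d.
n, e, d = 3233, 17, 2753

m = 42   # the credential the subject wants certified
r = 99   # blinding factor coprime to n, known only to the subject

blinded = (m * pow(r, e, n)) % n        # subject blinds the message
blind_sig = pow(blinded, d, n)          # signer signs the blinded value,
                                        # learning nothing about m
sig = (blind_sig * pow(r, -1, n)) % n   # subject removes the blinding

# The unblinded signature verifies as an ordinary RSA signature on m.
assert pow(sig, e, n) == m
print(sig == pow(m, d, n))  # True: identical to a directly issued signature
```

Because the unblinded signature is indistinguishable from one issued directly, a subject can present exactly the required credential and nothing more, making over-collection during credential checks expensive for the collector.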

Another safeguard which would help to make database operations self-enforcing is to grant subjects information rights to the decision process. To illustrate this point, consider the processing of a loan application by an expert system. Suppose the subject were granted the information right to know what analytic decision rule would be used to make the decision. Using his personal computer, he could compute the amount the financial institution must lend him. Failure of the bank to grant the loan would be grounds for possible legal action by the subject against the bank for using another criterion or extraneous data.

Likewise, if the various analytical screening functions used to make decisions concerning jobs and other human concerns were in the public domain, the participants could police the process themselves without a great deal of government intervention. To ensure equal opportunity the state would require that all job openings be announced by being placed in the appropriate file in the social nervous system and that the screening functions be based on competency and not personal prejudice.

To provide decision makers freedom to act, the employer would be allowed to use for analytic screening any inputs and criteria which had been demonstrated significant for predicting performance. To inhibit lawsuits, he would prepare a short statement of references. The onus would then rest on the challenger to demonstrate that the evaluation criterion or inputs were not significant. If employment decisions occurred primarily through the database screening, then social standards for nondiscrimination would be quickly established.

Many decisions would have both an analytic aspect, in the sense of reducing a large pool to a small set of final candidates, and a subjective aspect, the evaluation of the final candidates. The participants could police the analytic part using the rules for extraneous inputs and valid criteria. In such a competition the losers would know the evaluation criterion and could independently compute their rankings relative to the winning set. Thus, to satisfy due process, employment decisions would increasingly be based on merit. In time, this merit-based system for making employment decisions will become essential, as international corporations will have to guarantee due process for competitors from vastly different cultures.
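The losers' independent check described above is mechanical once the criterion is public. The pool, fields, and weights below are made up; the sketch shows a losing candidate recomputing every score and testing whether the announced winners really are the top of the ranking.

```python
def score(candidate, weights):
    return sum(w * candidate[f] for f, w in weights.items())

def verify_outcome(all_candidates, weights, announced_winners, n_slots):
    """True if the announced winners are exactly the top n_slots by the
    public criterion; any participant can run this check independently."""
    ranked = sorted(all_candidates, key=lambda c: -score(c, weights))
    expected = {c["name"] for c in ranked[:n_slots]}
    return expected == set(announced_winners)

pool = [
    {"name": "A", "exam": 88, "experience": 5},
    {"name": "B", "exam": 92, "experience": 2},
    {"name": "C", "exam": 70, "experience": 9},
]
weights = {"exam": 1.0, "experience": 4.0}   # the public criterion

print(verify_outcome(pool, weights, ["A", "C"], 2))  # True
print(verify_outcome(pool, weights, ["A", "B"], 2))  # False: B was outranked
```

A `False` result is precisely the evidence a challenger would bring: the announced outcome does not follow from the published criterion and inputs.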

Two other safeguards protect individuals and legal entities. The first is a time delay in the release of information. To enable the stockholder to analytically evaluate his agent and the voter his government official, the right to learn implies information rights for accountability; as models to analyze principal-agent behavior grow in size and complexity, the demand for accountability inputs will grow accordingly. This right conflicts with other informational concerns. For private firms, one such conflict involves trade secrets. To resolve it, some accountability information would have to be released with a time delay: the faster the rate of technological change, the shorter the delay. Voters would have similar information rights to analyze the behavior of government officials. To facilitate decision making by public and private officials, the release of information on decisions not subject to an open-meeting or disclosure policy would occur after the fact, since the principal protection from day-to-day interference in decision making is the time delay in the release of information.
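The time-delay rule can be sketched as an embargo whose length shrinks as the rate of technological change rises. The decay formula and the numbers are hypothetical illustrations of the principle stated above, not a proposed schedule.

```python
from datetime import date, timedelta

def embargo_years(base_years, tech_change_rate):
    # Hypothetical decay rule: faster change implies a shorter delay.
    return base_years / (1 + tech_change_rate)

def releasable(decision_date, today, base_years, tech_change_rate):
    """True once the embargo on a decision's records has elapsed."""
    delay = timedelta(days=365.25 * embargo_years(base_years, tech_change_rate))
    return today >= decision_date + delay

# A decision made in 2015, checked in 2020: a 5-year base embargo is
# halved by a fast-changing industry (rate 1.0 gives a 2.5-year delay).
print(releasable(date(2015, 1, 1), date(2020, 1, 1), 5, 1.0))  # True
print(releasable(date(2019, 1, 1), date(2020, 1, 1), 5, 1.0))  # False
```

The same record thus becomes available for accountability analysis sooner in fast-moving industries, where its value as a trade secret decays more quickly.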

A second safeguard for individuals is the right to deny physical and electronic access to their communities, a right which was discussed in the chapter on the community. Given a much more open political economy, more explicit defenses against possible information abuse must be constructed. As was pointed out previously, the town has the right to act as an electronic unit in purchasing services. This creates a mask between the members inside and the institutions outside. Second, households in town would have the right to limit electronic access. For example, an individual, if he desired, could determine the number of a caller before deciding to respond. He could program his computer to filter incoming calls and answer in person only those he wished. Similarly, an individual could defend himself against unwanted ads by simply filtering them out. The right to limit access would provide the individual with the right to be left alone if he so desired.
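The household filter described above reduces to a small routing rule. The numbers, allow list, and advertising prefixes are invented; the sketch shows the individual's computer answering known callers, dropping unsolicited ads, and deferring everything else.

```python
ALLOW = {"555-0101", "555-0134"}   # callers the subscriber answers in person
AD_PREFIXES = ("800-", "900-")     # hypothetical bulk-advertising exchanges

def route_call(caller_id):
    """Route an incoming call per the subscriber's filtering preferences."""
    if caller_id in ALLOW:
        return "ring"        # respond in person
    if caller_id.startswith(AD_PREFIXES):
        return "drop"        # unwanted ads are filtered out silently
    return "voicemail"       # all other calls are deferred

print(route_call("555-0101"))  # ring
print(route_call("800-5000"))  # drop
print(route_call("555-9999"))  # voicemail
```

The default matters: unknown callers are deferred rather than dropped, so the right to be left alone does not become isolation unless the individual chooses it.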



Fred Norman
Mon 14 Dec 98