HessenDATA, the big data analysis and evaluation platform of the Hessen police, comes from the controversial US vendor Palantir. Our guest author, Giel Ritzen, has examined whether, and if so which, data protection laws apply to HessenDATA at all. Read a short version of his analysis here (in English).
HessenDATA is a person-based big-data analysis and evaluation system in use by the Hessen Police. Although the system can be used for any ‚serious crime', ranging from residential burglary to financial crimes, its use is mainly justified as a means to find potential suspects in ‚the fight against (Islamic) terrorism'. Hence, the tool can be (and most likely IS) used for a variety of criminal activities, ranging in seriousness.
Several contributions have been written on the use of HessenDATA and its potential implications for personal data protection and privacy. These previous contributions mainly concern issues regarding the acquisition of data, i.e., the integration of social media data into the platform. In this contribution, I focus on potential issues that can arise after the data has been integrated into the platform and is then analysed.
This paper is structured as follows: First, I briefly discuss the essentials of the architecture of HessenDATA. Then, I continue with the first legal analysis by discussing the applicability of the EU data protection framework to the processing of data in HessenDATA, and which legal instrument applies directly. The second part of the legal analysis contains a discussion of two potential issues for personal data protection and privacy that can occur during the analyses in HessenDATA, after which I conclude this contribution.
2 The most essential features of HessenDATA and the entities involved
Before I go on to discuss the most essential features of HessenDATA, it is important to quickly shed light on Palantir and its platform Gotham. Palantir is an American IT company that specialises in big data analytics. One of its systems is Gotham, a big-data analysis platform widely in use by US law enforcement and intelligence agencies. Gotham promises to integrate all structured and unstructured data, after which it „can flow into the platform". By this, Palantir means that it transforms all data into coherent information objects so that those objects (people, places, things, or events) can all be linked to one another to represent relevant relationships. After the information model has been established, the relationships and assets can be analysed by authorised users. Palantir regards the collaboration between human analysts and the software's algorithms as the perfect hybrid solution to „explore divergent hypotheses and surface unknown connections and patterns".
I will not explain the structure of the platform in detail, since that has been done in previous contributions. It is, however, important to highlight the most relevant app for the analysis: AVA. This „investigative assistant" constantly updates the database in real time with newly processed information and alerts users (human analysts) to new potential connections between information objects. Hence, AVA guides human analysts during their use of the software by providing recommendations.
It is, however, important to explain a few essential elements of the storage structure of the underlying database, which is similar to the entity-attribute-value-with-classes-and-relationships (EAV/CR) model. First, every concrete or abstract element contained in the information and representing an object from the real world gets its own ‚information object' assigned in the database. Such an information object consists of an information object type and an object name. Second, all the relevant identifying and descriptive characteristics of every information object are represented by an appropriate attribute in the database, each consisting of an attribute type (like surname, birth date, nationality and the like) and an attribute value (like ‚Miller', ‚1998-12-11', or ‚British'). Each such attribute is then linked to its respective information object in order to express the ‚belonging' of that characteristic to the respective object.
Third, if connections between objects in the real world can be detected (for instance the address of a company, the telephone number of a certain individual and the like), a relationship between the respective information objects is created in the database, consisting of the IDs of information object 1 and information object 2, and a name expressing the kind of relationship between these objects (like „address of" or „phone number of" and the like). This process step resembles forming simple main sentences of a natural language, consisting of the elements subject (object 1), predicate (kind of relationship), and object (object 2).
Each of these different structural elements has its own designated storage area. This means that information objects of all types, attributes of various types, or documents of different types are stored within ONE AND THE SAME storage area for objects, attributes, documents and so on. This explains why it is necessary to include a „type" element in every data record within each storage area: to be able to distinguish „apples from pears", i.e., persons from companies, or a data value such as ‚1998-12-11' as either being the birth date of a person or the date of conclusion of a certain insurance contract.
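To make this storage structure more concrete, the following is a minimal, purely hypothetical sketch in Python. The class names (InfoObject, Attribute, Relationship, Store) and the sample records are my own illustration of an EAV/CR-style model and do not reflect Palantir's actual schema: each record carries a „type" element, each structural element has its own shared storage area, and attributes and relationships reference objects only by ID.

```python
from dataclasses import dataclass

# Illustrative EAV/CR-style sketch; names and records are hypothetical,
# not Palantir's actual schema or real police data.

@dataclass
class InfoObject:
    obj_id: int
    obj_type: str   # the "type" element, e.g. "person" or "company"
    name: str

@dataclass
class Attribute:
    obj_id: int     # links the attribute to its information object
    attr_type: str  # e.g. "surname", "birth date", "nationality"
    value: str

@dataclass
class Relationship:
    source_id: int  # information object 1
    target_id: int  # information object 2
    kind: str       # e.g. "phone number of", "address of"

class Store:
    """One shared storage area per structural element type."""
    def __init__(self) -> None:
        self.objects: list[InfoObject] = []
        self.attributes: list[Attribute] = []
        self.relationships: list[Relationship] = []

    def attributes_of(self, obj_id: int) -> list[Attribute]:
        return [a for a in self.attributes if a.obj_id == obj_id]

    def related_to(self, obj_id: int) -> list[Relationship]:
        return [r for r in self.relationships
                if obj_id in (r.source_id, r.target_id)]

store = Store()
store.objects.append(InfoObject(1, "person", "Miller"))
store.objects.append(InfoObject(2, "phone number", "0123-456789"))
store.attributes.append(Attribute(1, "birth date", "1998-12-11"))
store.attributes.append(Attribute(1, "nationality", "British"))
store.relationships.append(Relationship(2, 1, "phone number of"))

print([a.attr_type for a in store.attributes_of(1)])
print(store.related_to(1)[0].kind)
```

Note how the relationship record reads like a simple sentence: object 2 (subject) „phone number of" (predicate) object 1 (object), which is what allows such a store to represent arbitrary real-world connections without changing the schema.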
Lastly, before turning to the assessment of the applicable legal framework, it is necessary to introduce the players involved with HessenData. These are:
- The Hessen Police, a public authority that uses the system for its analyses, both in the realm of crime prosecution and crime prevention.
- The Hessian Headquarters for Data Processing (HZD), a service provider for Hessian state authorities. It built special server facilities to ensure safe data processing and offers „housing services" to the Hessen Police.
- Palantir, a US system developer providing the analysis platform (or „operating system", in its own words) Gotham for law enforcement and security agencies. Its German affiliate – Palantir Technologies GmbH – was awarded a contract by the Hessian Ministry of the Interior. As with every Palantir deployment, the system needed to be adapted to the specific requirements of the Hessen Police and to the information sources to be interfaced and analysed by the system. The system resulting from this adaptation was then named „HessenDATA".
However, according to the public tender announcement for HessenDATA, Palantir also OPERATES the system, since the tender speaks of „procurement and operation" („Beschaffung und Betrieb …") of an analysis platform for the Hessen Police to effectively combat Islamist terrorism and serious and organized crime.
3 General applicability of data protection legal framework: Is personal data being processed in HessenDATA?
This chapter focuses on the impact of HessenDATA on civilians' personal data protection and privacy. In the previous chapter, we analysed the platform and its essential features. Now, to properly assess to what extent personal data protection and privacy are safeguarded, one must assess the general applicability of the data protection framework. Before turning to this exercise, however, it is important to first explain this framework.
3.1 Legislative framework on EU, federal, and state-level
First, data protection legislation is enacted on EU level, since the EU has an explicit legal basis in Article 16 Treaty on the Functioning of the European Union (TFEU). The relevant legislative framework consists of the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED). Within the data protection legal framework, the GDPR is the lex generalis for personal data protection, since the „regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system".
The other relevant legislative instrument is the LED, a lex specialis which only applies when personal data is processed by competent authorities „for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security“ . It was acknowledged that „specific rules on the protection of personal data and the free movement of such data in the fields of judicial cooperation in criminal matters and police cooperation based on Article 16 of the Treaty on the Functioning of the European Union may prove necessary because of the specific nature of these fields.“
Second, the GDPR, being a regulation, has direct effect, which means that this legislative instrument does not need to be transposed into Member State law. The LED, on the other hand, is a directive, which means it needs to be transposed into Member State law. Furthermore, due to the federal structure of Germany, the states (like Hessen) have their own data protection legislation into which the LED is transposed. In the case of Hessen, this is the Hessian Data Protection and Information Act (HDSIG). §1 HDSIG explains the scope and applicability of the Act and, inter alia, makes clear that the Act does not apply to the extent that the GDPR is applicable. Hence, the HDSIG and the GDPR are the relevant legislative instruments that need to be examined in order to assess the level of protection of data subjects affected by processing within HessenDATA.
3.2 Relevant provisions within HSOG
Now that we've covered the relevant legislative instruments that offer personal data protection, it is also important to discuss the relevant provisions in Hessian police law that offer a legal basis for the Hessen Police to process personal data in HessenDATA.
The relevant piece of legislation to examine is the Hessian Act on Public Security and Order (HSOG). The first part of this act provides rules on the competences and tasks of law enforcement, like rules on the use of their powers, rules on the use of force, and, most importantly, rules on the different types of data they can use. The second part of the act deals with the organisational structure of the police. §25(1) HSOG authorises law enforcement to store personal data of the people listed in §§6, 7 and 13 section 2 Nr. 1. According to §25(2), personal data of other people cannot be stored unless it seems necessary to fulfil their law enforcement tasks. This paragraph, however, does not cover the automated analysis of data, which is made possible by HessenDATA. After the reform of the Act on 23 August 2018, §25a was introduced, which deals with the „automated application for data analysis".
This provision allows law enforcement to analyse personal data through automation, for the prevention of the crimes listed in §100a Abs. 2 der Strafprozessordnung, so-called ‚serious crimes'. This covers several very serious crimes such as high treason, murder, rape, or possession of child pornography, but also less serious crimes like residential burglary (Wohnungseinbruchsdiebstahl), money laundering, or subsidy fraud. Therefore, although HessenDATA is mainly promoted as a tool in the fight against terrorism, it can be, and probably is, used in the prevention and detection of a wide range of different crimes, ranging in seriousness.
3.3 Applicability of data protection law
Before one can determine the applicability of either the GDPR or the HDSIG in the case of HessenDATA, one must consider whether HessenDATA (1) processes (2) personal data. According to Article 4(2) GDPR and §41(2) HDSIG, „processing" is „any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means". Moreover, „organising", „storage", and „structuring" are listed as examples. Since HessenDATA is designed to perform these actions, we can conclude that HessenDATA processes data.
Now, do the data elements qualify as „personal data"? According to Article 4(1) GDPR and §41(1) HDSIG, this is the case if the data is
- any information
- relating to
- an identified or identifiable
- natural person
Hence, in the following paragraphs, these four building blocks are briefly discussed. For a better understanding of the assessment, I will start with the fourth building block.
3.3.1 „Natural person“
Recital 27 of the GDPR clearly states that the regulation does not apply to personal data of deceased persons. Now, HessenDATA is mainly used to prevent terrorist attacks by identifying living potential suspects, victims, contact persons, or any other relevant individuals, and their networks. Hence, this requirement is fulfilled.
3.3.2 „Any information“
From the terminology, it is apparent that the concept must be interpreted broadly from every perspective: irrespective of the nature of the information, the content of the information, or the format or medium on which the information is contained. It is also irrelevant whether the nature of the information is objective (facts) or subjective (an opinion or interpretation). Moreover, as the 29WP (the predecessor of the European Data Protection Board) mentions, it is irrelevant whether the information is true. After all, it would not make sense to incorporate a „right to rectification" if only true statements could constitute personal data.
Therefore, all information objects in HessenDATA that originate from the different sources (the databases POLAS, CRIME, and ComVor; traffic data from telecommunications, requested from providers or gathered through phone extracts; and different social media networks), and every analysis that builds on these information objects, are „any information".
3.3.3 „Relating to“
This element concerns the relationship between an individual in the real world and the information object representing them in the database. In principle, „relating to" someone means that the information is about that person. According to the 29WP, there are three ways in which information can relate to an individual, in terms of:
- content – a piece of information is directly about that person (like a social security number);
- purpose – a piece of information is used with the purpose to, inter alia, evaluate someone; or
- result – the usage of a piece of information is „likely to have an impact on a person’s rights and interests“.
Since the purpose of the processing of the information in HessenDATA is to evaluate the person behind the information object, by assessing whether this individual is a potential suspect, victim, contact, or other relevant person, the data processed in HessenDATA surely relates, by reason of purpose, to all involved data subjects.
It is, however, important to realise that information can relate in content, purpose and result to multiple people at the same time, in different ways. Take for example a phone number. If a phone number is linked to a specific person, this phone number is information about that person, and therefore relates in content to that person. However, if this phone number is checked to evaluate who has called it, the purpose of using this piece of information is evaluation. Hence, this piece of information then relates to all people who are evaluated using this information. If the police then determine that particular other people need to be identified because they have contacted this phone number, and they will therefore be under further scrutiny by the police, this piece of information also relates in result to those people, since the result of the use of this information influences these persons' fundamental rights and freedoms. Hence, much of the information that is processed in HessenDATA will relate to different persons at the same time, in different ways.
3.3.4 „Identified or identifiable“
The person to whom the information relates must be identified or identifiable on the basis of that information. If the identity of a person can be established based on the information relating to them, this individual is identified. This is, for instance, the case with a criminal record, or an analysis of an already identified suspect. The suffix „-able" in „identifiable" makes clear that this requirement focuses on the possibility of identification. Identifiability depends on several factors, explained below: the reasonableness test, technological possibilities, identifiers, and the purpose of processing.
As is apparent from recital 26 GDPR and recital 21 LED, and as explained by the 29WP: „To determine whether a natural person is identifiable, account should be taken of all the means reasonably likely to be used, such as singling out, either by the controller or by another person to identify the natural person directly or indirectly". One must take all objective factors into account, like the technological possibilities, the identifiers (bits of information that can lead to possible identification), and the purpose of the processing. If the purpose implies „the identification of the data subject", it can be assumed that the data controller has the means reasonably likely to be used to identify the data subject.
Now, the Hessen Police has access to many different data sources. These sources involve many direct identifiers, such as social security numbers, and indirect identifiers, such as IP addresses. HessenDATA allows the police to structure and organise all these different identifiers to constitute a profile. If a potential suspect or other relevant person cannot be linked to such indirect identification objects in the police systems, data acquired from social media can help to overcome this obstacle. Moreover, companies like Clearview AI offer law enforcement agencies such as the Hessen Police the possibility to use Clearview AI's facial recognition algorithm in combination with its database (which allegedly consists of 3 billion images scraped from the internet).
The purpose of the profiling operations in HessenDATA is to profile potential suspects and other relevant persons. If the processed information does not lead to identification of the relevant persons, the profiling operation is not successful. Moreover, one must consider that the identifiability requirement is fulfilled if there is a reasonable possibility to clearly distinguish a certain individual, and the information object representing them, from any other person object within the system. Considering the many data sources to which the police have access, the technological possibilities of HessenDATA, and ever-advancing technological developments like facial recognition tools, one can conclude that it is reasonably likely that the information relating to natural persons processed in HessenDATA makes these individuals individually distinguishable by their respective patterns of linked objects.
3.3.5 Inferred data
Then, lastly, there are the recommendations produced by HessenDATA for human analysts through the automated transformation of unstructured information from various sources into a standardized and fixed structure (of typed and named objects, relationships between objects, as well as attributes of objects and documents): inferred data. Does the broadness of the notion of personal data mean that these algorithmic recommendations must also be qualified as such? The answer is: yes! After all, recommendations satisfy the requirement of „any information", and they „relate" to the respective natural person the recommendation is about. Moreover, the purpose of the analysis is the distinction of a unique individual, and since the Hessen Police can use many direct and indirect information elements linked to the respective information object to succeed in this purpose, it is likely that the reasonableness requirement regarding unique distinction will be met.
In conclusion, the data protection legal framework applies, since HessenDATA processes (1) any information (2) relating to (3) identified or identifiable (4) natural persons.
4 So what is the applicable legal instrument: HDSIG, GDPR, or a combination of both?
In the last chapter, we have provided an overview of the relevant legislative acts and regulations with regards to the processing of personal data in HessenDATA. In this chapter, we determine which legal instrument applies to this processing.
As explained, within the data protection legal framework, the GDPR is the lex generalis for personal data protection, since the „regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system“ (Article 2(1) GDPR).
However, it follows from §1(1) in conjunction with §1(5) HDSIG that the HDSIG is the applicable legal instrument regarding the processing of personal data by public authorities of the state, municipalities, and counties, if the GDPR does not apply. According to Article 2(2)(d) GDPR, this is the case when personal data is processed „by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security".
Now, the Hessian Police is a public authority, and it processes personal data for the purposes listed in Article 2(2)(d) GDPR. Therefore, we can conclude that the HDSIG applies to the processing in HessenDATA.
So that is it? Well, as previously explained, according to the public tender, Palantir is tasked with the OPERATION of the platform. The term „operation" is undefined and therefore unclear. If „operating the platform" also means that Palantir collects personal data to share with the Hessen Police, this would result in information-sharing between Palantir and the Hessen Police. This would mean that Palantir acts as a [data] controller, since, in such a case, it determines the purposes and means of the processing of personal data (Article 4(7) GDPR). If that is the case, the HDSIG does not apply to that processing, since Palantir is not a public authority. Instead, the GDPR would apply! Although the regulation does not apply to processing for law enforcement purposes as laid down in Article 2(2)(d) GDPR, this is only the case when the processing is done by competent authorities, which Palantir is not.
This situation would be extremely confusing, since the HDSIG would apply to the processing by the Hessen Police, while the GDPR would apply to the processing by Palantir. Since one cannot clearly distinguish between these processing operations, however, the two legislative instruments would apply simultaneously.
Now, it needs to be stipulated that, according to Palantir, the company does not extract (collect) or share any data. The company claims that it merely provides software services on a contractual basis, which are used to structure and organize large datasets. If this is indeed the case, there is no information-sharing, since only the Hessen Police provides data. That would mean that Palantir merely acts as a processor, on behalf of the Hessen Police. Hence, in such a case, the HDSIG would apply to all processing within HessenDATA (for law enforcement purposes).
Although there is a lack of transparency regarding Palantir's „operations" and what these entail exactly, the statements by Palantir and the Hessen Police alone do not provide sufficient certainty that the assessment in this section is indeed the correct one. Due to the lack of possibilities to verify these statements, I will assume their truthfulness and continue the assessment with the conclusion that the HDSIG is the applicable legislative instrument.
5 The impact of automated processing in HessenDATA on data subjects' privacy and the protection offered by the data protection legal framework
In this final chapter of the legal analysis, I discuss two potential issues regarding the data subjects' rights as laid down in the data protection legal framework.
As mentioned, HessenDATA is used to structure, and then analyze, vast volumes of information to, inter alia, identify potential suspects and their networks, in order to prevent terrorist attacks. Hence, the role of an individual is not necessarily clear at the beginning of an analysis. The individual could be a potential suspect, a contact of one, or someone who is not involved in any way. Through the analyses, the Hessen Police gains a detailed individual pattern about the person. Hence, there is an inherent risk of a conflict with the affected person's personal data protection and privacy. Whether the conclusion of these analyses is that even more surveillance needs to be conducted, or that the subject is not relevant, this person deserves an adequate level of data protection. So, do the data subjects' rights, laid down in §§50 to 56 HDSIG, provide this adequate level of data protection? Let's start with an analysis of §49 HDSIG, a provision dedicated to profiling and automated processing.
5.1 The protection offered to data subjects against decisions based on automated processing in HessenDATA
§49 HDSIG is the Hessian implementation of Article 11 LED. It states: „A decision based solely on automated processing which implies an adverse legal consequence for the data subject or which significantly affects them shall be permissible only if it is provided for by law."
Hence, a legal basis is needed. Now, the law that permits such decisions is the Hessian Act on Public Security and Order (HSOG), particularly §25a HSOG. It is clear that the automated analysis in HessenDATA falls under the processing listed in these provisions and therefore has to comply with the requirements they set out. However, are these requirements sufficient?
Again, let's examine the requirements. First, the decision must be based solely on automated processing. At first sight, it may seem that this is not the case with the decisions taken in HessenDATA, since a human analyst takes decisions based on the algorithm's recommendations. However, HessenDATA automatically categorises information elements (information objects, attributes, documents) and establishes relevant links/relationships between them. If HessenDATA finds a particular relationship/link interesting, AVA alerts the human analyst. This recommendation can be seen as the software's decision to point to, or highlight, a particularly interesting relationship, and is therefore, in principle, a decision based solely on automated processing.
Second, the data subject must experience adverse legal effects or be significantly affected by the decision. Although the LED does not clarify when this threshold is crossed, the WP29 lists increased security measures or surveillance by the competent authorities as an example of an adverse legal effect. Although some scholars claim that internal interim evaluations resulting from automated processes are not to be considered adverse legal consequences, they can be considered to significantly affect the data subject if their fundamental rights are ‚sufficiently' affected, i.e., because the evaluation is not only short-term and has more than trivial effects. However, this is hard to qualify. Although real-life surveillance means that the decision has significantly affected the data subject, decisions in online surveillance that may lead to real-life surveillance could be qualified as not having an adverse legal effect and not significantly affecting the individual.
Third, although this requirement is not explicitly listed in §49 HDSIG, contrary to Article 11 LED, there must be appropriate safeguards in place to protect the data subjects' rights, of which the right to obtain human intervention is the bare minimum. First of all, this seems to be an incorrect transposition of the LED. Although some scholars argue that „these guarantees are provided via German constitutional and administrative law in its design and application", the requirement of human intervention is not explicitly listed in the HSOG. However, assuming the requirement is in fact in place, is it sufficient?
According to the 29WP, „in order to be significant, the human intervention must be carried out by someone who has the appropriate authority and capability to change the decision and who will review all the relevant data including the additional elements provided by the data subject". Hence, the human analyst must meet a certain standard of authority and capability before being eligible to review the automated decisions. If this person is not capable of doing so, there is a great chance that the analyst will over-rely on an algorithmic recommendation. Now, it turns out that over-reliance on algorithmic recommendations is not uncommon. In such a case, according to Lynskey, such decisions „should de facto be viewed as automated". Therefore, it is important to further examine the requirement of human intervention, to see whether it is meaningful and significant.
Unfortunately, there is a lack of systematic training of young police officers in the field of intelligence analysis. Annette Brückner stated: „From past professional experiences, I know that there (at the Federal Criminal Police Office – Bundeskriminalamt) was [only] one course on this [special training in the area of intelligence analysis], which offered exactly two free spots for every German state, per year". Intelligence analysis trains police officers to think analytically rather than intuitively. If analysts think intuitively, however, there is a high risk that bias will influence their decisions. And bias influences police officers even if they do not show explicit stereotypes. Criminological research has shown that law enforcement officers are prone to focus on ethnicity and race. Furthermore, considering that police data is inherently biased, and this data is used as training data, a feedback loop is created that enhances that bias even more. Ultimately, it is important to realize that the automated analyses in HessenDATA that are (partially) based on biased data produce biased recommendations. And because human analysts are insufficiently trained to think analytically, the chance of over-reliance on the algorithmic recommendations is significantly increased.
In conclusion, although these recommendations can, and should, be viewed as decisions based solely on automated processing, it is questionable whether they qualify as significantly affecting the data subjects concerned. Moreover, the right to obtain human intervention is not explicitly listed in the HDSIG, seems to be lacking in the HSOG, and would not make a genuine impact anyway, since the intervention is neither meaningful nor significant.
5.2 The protection offered to data subjects against inaccurate recommendations produced in HessenDATA
As a follow-up question, I now discuss what the data subjects' options are when the recommendations that are produced are inaccurate. The principle of data accuracy requires that the personal data being processed is accurate. This principle is laid down in §42(4) HDSIG, and the provision states the following:
“Member States shall provide for personal data to be accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay”
The principle is transposed into obligations for the data controller. §59(1) HDSIG requires the controller to „take appropriate technical and organisational measures to ensure a level of protection appropriate to the risk when processing personal data“. §59(3) lists a number of these measures for the controller to take, such as access control, storage control, or reliability.
Apart from these obligations, the principle has also been transposed into the data subject’s right to rectification or erasure and restriction of processing (§53 HDSIG). Recital 47 of the LED further specifies that this right relates particularly to inaccurate facts, and that erasure is only needed „where the processing of such data infringes the Directive“. Before a data subject can exercise the right laid down in §53 HDSIG, however, this person must know that inaccurate data is being processed. Hence, we must examine §§51 and 52 HDSIG.
§51 HDSIG, in conjunction with §29 HSOG, grants the data subject the right to obtain general information, such as the existence of the right to rectification and the purpose of the processing. Recital 42 clarifies that this information could simply be provided on the website of the police. Apart from this general notification duty, the data subject has the right to make an access request to obtain confirmation from the controller as to whether or not personal data concerning him or her is being processed, according to §52 HDSIG. If that is the case, he or she must be given access to all information listed in §52(1). This includes the notification that the data subject has the right to rectification, deletion or restriction of processing.
At first sight, it seems as though the data subject has sufficient options to exercise their rights should inaccurate personal data about them be processed. However, as is clear from §51(2), „the controller may postpone, restrict or refrain from notification to the extent and for as long as otherwise the fulfilment of tasks referred to in §40, the public safety, [or] the rights or freedoms of third parties would be endangered, or [this notification] would be detrimental to the well-being of the Federation or a State.“ Moreover, §52(2) allows the controller to refrain entirely from complying with a data subject’s access request under these same conditions.
Although the WP29 argues that such a measure „should be applied as the exception rather than the rule“ , and that „blanket exemptions should not be used“ , does the data subject have a realistic chance of exercising their right to rectification, erasure or restriction of processing when inaccurate personal data about them is processed in HessenDATA?
As explained, the HDSIG provides several provisions to protect the data subject against the negative implications of inaccurate personal data. The controller has the obligation to implement technical and organisational measures, such as access control, storage control, or reliability. However, algorithmic errors can still occur, for example due to an unforeseen error in the software. Performance issues can arise through „large, complex software implementation(s)“. It can also simply be the case that the inaccuracy was already embedded in the data that was acquired. In short, an algorithmic recommendation can certainly be inaccurate.
The existence of such errors in HessenDATA is not hypothetical, since such errors have occurred in GOTHAM (the platform on which HessenDATA is based). False associations between innocent individuals and suspects were made, as is apparent from a DPIA of the Calgary Police Department. Moreover, Europol experienced the issue that suspects and innocent persons were mixed up due to „performance issues“. Hence, although the controller has these specific obligations, mistakes happen, and this negatively impacts the data subject’s personal data protection.
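How such false associations can arise is easy to illustrate. The following sketch shows a hypothetical, naive entity-resolution step (this is not Palantir's actual matching logic; all names, records, and the threshold are invented for illustration): a string-similarity cut-off generous enough to tolerate spelling variants of the real suspect's name will also link an innocent person with a similar name to the suspect record.

```python
from difflib import SequenceMatcher

# Hypothetical suspect record, e.g. transcribed without umlauts.
suspect_record = {"name": "Jan Muller", "city": "Frankfurt"}

# Hypothetical population data integrated into the platform.
population = [
    {"name": "Jan Müller",  "city": "Frankfurt"},   # the actual suspect
    {"name": "Jana Müller", "city": "Frankfurt"},   # an innocent resident
    {"name": "Petra Klein", "city": "Wiesbaden"},
]

def similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

THRESHOLD = 0.8  # arbitrary cut-off, chosen so that "Muller" matches "Müller"

matches = [
    p for p in population
    if similarity(p["name"], suspect_record["name"]) >= THRESHOLD
    and p["city"] == suspect_record["city"]
]

for m in matches:
    print("linked to suspect record:", m["name"])
```

With this threshold, both "Jan Müller" and the innocent "Jana Müller" are linked to the suspect record: the very tolerance that makes record linkage robust against data-entry variation is also what produces false associations of the kind reported for GOTHAM.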
So, what about their rights, provided for in §§51 to 53 HDSIG? Do these provide sufficient protection against the negative implications of inaccurate recommendations? In theory: yes. In practice: highly questionable.
First, the recommendation itself may not be covered, even if it is based on data (facts) that have been tested for accuracy. As explained, data subjects have the right to have inaccurate personal data relating to them rectified, „in particular where it relates to facts“. Inferred data, however, is not regarded as facts, since inferences are „probabilities or estimates, which have different levels of reliability than facts“. Second, the data subject has the right to have inaccurate data rectified, but first needs to know that his or her data is processed in HessenDATA. The Hessen Police, however, can significantly restrict or omit the right to obtain information „for as long as otherwise the fulfilment of tasks referred to in §40, the public safety, [or] the rights or freedoms of third parties would be endangered, or [this notification] would be detrimental to the well-being of the Federation or a State.“
Considering that this data subject will, at the moment of processing, still be under evaluation as a potential suspect or contact in a terrorist network, it is highly unlikely that the Hessen Police will provide information to the data subject. First, the human analyst might still trust the algorithmic recommendation (perhaps due to over-reliance) and not contest its validity. Second, since the analysis is conducted to identify potential terrorists and their contacts, providing information to this person could prejudice the prevention of criminal offences and harm public or national security. Lastly, such a measure would likely be considered proportionate and necessary in a democratic society, given the far-reaching powers of law enforcement in the fight against Islamic terrorism.
Now, if the data subject has no knowledge that their personal data is automatically processed by HessenDATA and has led to an inaccurate recommendation, how would this person know that they can exercise their right to rectification?
Hence, the data subject affected by HessenDATA has, in theory, the right to have inaccurate algorithmic recommendations rectified. However, the data subject’s rights to obtain information, access, and rectification or erasure and restriction of processing (§§51 to 53 HDSIG) can be omitted for the purposes of preventing a criminal offence and protecting public security. Although an inaccurate algorithmic recommendation can severely impact the data subject’s rights, there is a significant chance that they will not be able to have this inaccurate recommendation rectified, erased, or its processing restricted.
In this article, I have reviewed the applicability of the data protection legal framework to the processing within HessenDATA. Since personal data is being processed in this tool, and the controller seems to be (only) the Hessen Police, the HDSIG is the relevant legal instrument to assess. This instrument, unfortunately, has incorporated too many exemptions allowing the Hessen Police to restrict or omit the data subject’s right to data protection. As assessed above, the specific provision in the HDSIG dedicated to HessenDATA’s processing offers little concrete protection. The absence of an explicit reference to the requirement of human intervention is only one example of this.
Even if this requirement were incorporated, it is doubtful whether it would offer any real protection to affected data subjects, due to the lack of training in analytical thinking within the Hessen Police. Furthermore, it is virtually impossible for a data subject to have inaccurate personal data about them rectified, due to all of the grounds for exemption on which the Hessen Police can rely. Since providing information could hamper the investigation process, and this process is never really finished, data subjects will likely not know that their data is being processed, let alone that they can have this data rectified if it is inaccurate.
Ultimately, this leads me to conclude that individuals’ personal data is NOT sufficiently protected against the processing (powers) within HessenDATA, and all of the negative consequences this can have.
About me
My name is Giel Ritzen. I am a lawyer and work mainly on data protection and the right to privacy. My passion for these topics began a few years ago, after I watched a documentary and read a few books (Je hebt wel iets te verbergen [You do have something to hide] by Maurits Martijn and Dimitri Tokmetzis, and (of course) Surveillance Capitalism by Shoshana Zuboff). Once I realised how much the fundamental rights to data protection and privacy are being restricted, merely so that more money can be made or for the sake of „public security“ (the ultimate buzzword since 9/11), it sparked an activist streak in me. This eventually led me to the Master’s programme Law and Technology in Europe at Utrecht University.
After completing my Master’s degree in the summer of 2021, I have been working since October as an intern at noyb (none of your business). For those who do not know noyb: it is an NGO that advances data protection and privacy through targeted and strategic litigation. Although I deal with various European data protection laws, my work mainly revolves around the GDPR. In addition to writing summaries and commentaries for the GDPRhub (https://gdprhub.eu), I prepare potential proceedings through legal research and by drafting analyses and complaints.
One of the topics that interests me most is algorithmic decision-making. Although the technology itself is not a problem, it often requires large amounts of (personal) data. For this reason, there is a natural tension with certain fundamental rights, such as the right to the protection of personal data and the right to privacy. I am no advocate of techno-solutionism, because too much faith in the capability of algorithms has already led to people being discriminated against on the basis of their skin colour, political preference, gender, and so on (just think of the child benefits affair in the Netherlands, or the use of various predictive policing algorithms, such as HessenDATA).
Through my current work, I have become acutely aware of how necessary it is that people continue to stand up for data protection and privacy. Although Europe has the best data protection laws in the world, some rights exist only on paper. Yes, as a citizen you have rights regarding the protection of your data. The enforcement of these rights, however, is lacking on all sides. The supervisory authorities are underfunded, cannot cope with the enormous workload, and are obstructed in their work by private parties and other state bodies. That is why we need more activism and more awareness. I will continue to work for a better world in which civil rights exist not only on paper but also in practice. I hope you will too.
 Michael Schaich, Innenminister eröffnet INNOVATION HUB 110 der hessischen Polizei in Frankfurt, 13.08.2020, last accessed on 26 April 2021.
 Hessendata: Daten Zusammenführen aus sozialen Medien mit Polizeidatenbanken?!, last accessed 23.08.2021
 For a detailed description on Palantir, please visit https://police-it.net/category/akteure/it_anbieter/palantir
 Palantir Gotham: Integrate, manage, secure, and analyze all of your enterprise data, accessed 27.04.2021
 Palantir Gotham: Systembeschreibung und Funktionsweise, 29.11.2018, last accessed 26.04.2021
 Nadkarni et al., Organisation of Heterogeneous Scientific Data Using the EAV/CR Representation, (1999) 6 J Am Med Inform Assoc. 478-479.
 Palantir Gotham: Systembeschreibung und Funktionsweise, 29.11.2018, last accessed 26.04.2021
 Effiziente IT für die Verwaltung, Hessische Zentrale für Datenverarbeitung
 Committee of Inquiry (n 50) 65-66.
 ibid 69-70.
 ‘Analysis platform for the effective fight against Islamist terrorism and serious and organized crime’ (Notice reference number: 12-7o50 Hu 12-02716 / 2017): https://ausschreibungen-deutschland.de/418472_Analyseplattform_zur_effektiven_Bekaempfung_des_islamistischen_Terrorismus_und_der_schweren_und_2018_Wiesbaden, last accessed 28.06.2021.
 Article 2(1) GDPR.
 Article 2(2)(d) GDPR.
 It is important to mention that Germany has been reprimanded by the EC because not all German States transposed the LED into State legislation within the transposition time. See https://eucrim.eu/news/infringement-proceedings-not-having-transposed-eu-data-protection-directive/
 Hessisches Datenschutz- und Informationsfreiheitsgesetz https://www.lexsoft.de/cgi-bin/lexsoft/justizportal_nrw.cgi?templateID=document&xid=8074311,1
 Hessisches Gesetz über die öffentliche Sicherheit und Ordnung (2004)
 §100a Strafprozessordnung (StPO)
 See ibid 10.
 Article 29 Working Party (n 74) 15.
 ibid 16.
 Clearview AI: searching our communities, last accessed 04.06.2021
 Clearview AI’s biometric photo database deemed illegal in the EU, but only partial deletion ordered (NOYB 28 January 2021) last accessed 4 June 2021.
 See 3.3.4
 Letter from Courtney Bowman, Director of Privacy & Civil Liberties, Palantir Technologies, to Rasha Abdul-Rahim, Amnesty International (18.09.2020), annex 1.
 Broeders et al. 314.
 See section 2.1.
 Also see: https://www.palantir.com/palantir-gotham/titan/index.html under “Palantir AVA”
 Article 29 Working Party, ‘Opinion on some key issues of the Law Enforcement Directive (EU 2016/680)’ (WP 258, 29.11.2017) 12.
 Löber, in Roßnagel, Hessisches Datenschutz- und InformationsfreiheitsG, §49, para 12 (Nomos).
 Schwichtenberg, in Kühling/Buchner, DS-GVO BDSG, BDSG §54, para 6 (C.H. Beck).
 Frenzel, in Paal/Pauly, DS-GVO BDSG, BDSG §54, para 6 (C.H. Beck).
 ibid 13.
 Oral evidence given by Chief Constable Michael Barton of Durham Constabulary to The Law Society Commission on the Use of Algorithms in the Justice System, in their report ‘Algorithms in the Criminal Justice System’ (2019) 20.
 Lynskey (n 36) 174
 Palantir-Dossier: IT der Sicherheitsbehörden – US-Anbieter auf dem Vormarsch – Teil 7, RT.DE, 09.12.2018, last accessed 21.6.2021
 Interview with Annette Brückner
 Was ist eigentlich Intelligence Analysis? referencing Clark (2013); Pherson & Heuer Jr. (2010).
 Lois James, ‘The Stability of Implicit Racial Bias in Police Officers’, (2018) 21(1) Police Quarterly 31, referencing Dasgupta (2013); L. James, Klinger, & Vila (2014).
 Kristian Lum, William Isaac, ‘To predict and serve?’ (2016) 13(5) Significance 15-16.
 ibid 16.
 In this Article, I will focus solely on the possibilities of rectification, and not on erasure or restriction of processing.
 ibid 20.
 Daniel Howden, Apostolis Fotiadis, Ludek Stavinoha, Ben Holst, Seeing stones: pandemic reveals Palantir’s troubling reach in Europe, The Guardian (London) 02.04.2021, accessed 14 June 2021.
 Kate Robertson, Cynthia Khoo, and Yolanda Song, To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada (2020) Citizen Lab and International Human Rights Program, University of Toronto, 48.
 See (n 123).
 Mark Leiser and Bart Custers (n 39) 377.
Copyright and usage rights
(C) 2022 CIVES Redaktionsbüro GmbH
All copyright and usage rights to this article rest with CIVES Redaktionsbüro GmbH and/or the named author(s). Links from other sites to this article, as well as reproducing its title and a short teaser on other sites, are permitted, provided that the source and the name(s) of the author(s) are correctly stated. A complete reproduction of this article on other sites or in other publications, as well as any editing and publication of the edited text without our prior written consent, is expressly prohibited.