Tax compliance and privacy rights in profiling and automated decision making

Luisa Scarcella, Department of Tax and Fiscal Law, University of Graz, Austria, luisa.scarcella@uni-graz.at

PUBLISHED ON: 22 Oct 2019 DOI: 10.14763/2019.4.1422

Abstract

New technologies allow tax authorities to carry out faster and automated analysis of large amounts of data, minimising errors and saving time. Some of these technologies enable tax administrations to identify and cluster taxpayers based on their risk of noncompliance; "high risk" taxpayers will consequently be audited. The European Union General Data Protection Regulation (GDPR) has introduced new provisions on automated decision making and on how individuals can be profiled - technologies such as those implemented by tax administrations could present difficulties in this area. Even if profiling and automated decision making in tax matters fall under the broader public interest exception, safeguards for taxpayers' privacy rights need to be in place.
Citation & publishing information
Received: March 31, 2019 Reviewed: June 18, 2019 Published: October 22, 2019
Licence: Creative Commons Attribution 3.0 Germany
Competing interests: The author has declared that no competing interests exist that have influenced the text.
Keywords: Taxation, Profiling, Automated decision-making, GDPR
Citation: Scarcella, L. (2019). Tax compliance and privacy rights in profiling and automated decision making. Internet Policy Review, 8(4). https://doi.org/10.14763/2019.4.1422

Introduction

The use of information technology is vital for the effective administration of tax systems, and in recent years tax administrations around the world have increasingly invested in information technology tools (OECD, 2016a; OECD, 2016b; OECD, 2019). Given the high number of taxpayers that need to be effectively and efficiently assessed, the support offered by new technologies has represented an opportunity for tax administrations. At the same time, while the digital economy poses new challenges to tax authorities and efficient tax law enforcement (OECD, 2015), the evolution of the digital world, including new cross-border business practices, has required revenue administrations to keep pace with new technologies themselves (Ehrke-Rabel, 2019a). Indeed, the complexity of interactions and transactions taking place at the taxpayers' level requires the processing of an increasing amount of information (OECD, 2016a; OECD, 2016b; Ehrke-Rabel, 2019a). Thus, the growth in Big Data and electronic financial transactions presents opportunities for tax authorities to collect taxes more efficiently and has made such tools popular with governments seeking to clamp down on tax avoidance and evasion.

In order to effectively manage a tax system, together with the use of new IT systems, an important role is played by data. Even in traditional reporting systems, tax agencies have always relied on the high volume of information provided by taxpayers and third parties. However, thanks to new technologies, it is much easier for tax administrations to gather and process such data. Governments have been creating and accessing a very large volume of taxpayers' data from different sources. These sources include public records, information gathered by other authorities (whether domestic or not), businesses (Jensen & Wöhlbier, 2012) and other third parties such as employers and banking or financial institutions (Ehrke-Rabel, 2019a; Ehrke-Rabel, 2019b). Data are the fuel of the technological tools implemented by tax administrations and can be used in tax collection, monitoring and for supporting auditing decisions. Indeed, by receiving and processing more data, tax administrations are able to reduce the information asymmetries which represent a threat to equal and complete tax collection (Ehrke-Rabel, 2019a; Doran, 2009; Lederman, 2010; OECD, 2017). Furthermore, the data gathered in this way can facilitate economic and policy design (OECD, 2016a).

On the one hand, the increasing role of technology brings many advantages to the tax system, allowing faster and more automated analysis of large volumes of data, minimising errors and saving time. On the other hand, the use of new technologies to process large amounts of taxpayers' data, including personal data, also brings uncertainty as to the level of automation that can be used without breaching privacy rights. For example, by processing this large amount of data, tax administrations can cluster taxpayers based on their profile in order to monitor them and decide which taxpayers shall be audited. This automated profiling of taxpayers could ultimately lead to the automation of decisions which affect the taxpayers. Moreover, concerns might arise as to who should provide the IT system, whether it is built in-house by the agency or outsourced.

The General Data Protection Regulation (GDPR) has introduced new provisions on how individuals can be profiled, and technology that enables automated profiling, such as that used by tax administrations for risk management and advanced data analytics (OECD, 2016a), has the potential to present difficulties in this area. At the same time, the use of these technologies in the tax field is justified by the need to safeguard the public interest (Art. 6 (3), Art. 9 (2) (g), Art. 23 (1) (e) GDPR), and advocating for full transparency of the tools at the disposal of tax authorities could itself put effective enforcement at risk.

This paper aims at highlighting one of the current issues related to finding the right balance between tax compliance and privacy rights. Its main focus is the relationship between the information and communications technology (ICT) tools enabling profiling and automated decision making in tax matters and the GDPR provisions which apply to the protection of privacy rights in this context. Moreover, this contribution aims at describing the policy implications of the interactions between GDPR provisions and automated decision making carried out by tax authorities.

This article consists of four sections. The first one contains an overview of the ICT tools used by tax administrations when carrying out their activities (e.g., tax monitoring, auditing, collection). The second section analyses how these ICT tools might perform profiling and automated decision making pursuant to the GDPR definition of these two key notions. The third section highlights how, in the context of the GDPR, the European legislator has tried to balance tax compliance needs with individual privacy rights. Finally, section four describes the policy implications for EU member states deriving from the relation between the GDPR provisions regulating profiling and automated decision making, and the instruments at tax authorities' disposal in the fight against tax evasion and fraud.

Section 1: ICT tools used by tax administrations

As recent studies conducted by the Organisation for Economic Co-operation and Development (OECD) and the Intra-European Organisation of Tax Administrations (IOTA) show, tax administrations around the world have integrated new technologies to improve their tax collection mechanisms (OECD, 2016a; OECD, 2016b; OECD, 2017; IOTA, 2018). Indeed, and more generally, revenue agencies need technologies in order to ensure transparency of operations, greater efficiency and responsiveness to the needs of government and taxpayers. The implementation of new technologies by tax administrations varies around the world, particularly between developing and developed countries (Kariuki, 2014). The need for IT is also reflected in the budgets of tax administrations and requires careful management (OECD, 2016a; OECD, 2016b). According to previous studies, 159 out of 193 UN member states use ICT-intensive systems for tax management (Tomar, Guicheney, Kyarisiima, & Zimani, 2016; World Bank, 2016).

In the last two decades, tax administrations have used ICT in different ways to enhance performance in revenue administration, including: providing readily accessible historical data; reducing mistakes, processing times and costs; and improving and promoting voluntary compliance, thereby increasing revenue collection (Smith, 1969; Edwards-Dowe, 2008; Chatama, 2013; Kariuki, 2014). Some administrations use new technologies just to perform their core and basic tasks, such as registration, processing, payment and accounting, audit targeting and debt collection (OECD, 2016a; OECD, 2016b; IOTA, 2018). Recent examples of ICT implementations in tax matters can be found in Slovenia, where certified electronic cash registers are connected to the tax administration, which is informed about transactions in real time, and in Chile and Italy, which have adopted electronic invoicing systems directly connecting taxpayers and the tax administration (IOTA, 2018). More broadly, examples of ICT tools used by tax authorities typically include: e-filing of tax returns, e-payments, data sharing and data matching, taxpayer self-help portals, and chatbots for technical enquiries (IOTA, 2018). These instruments rely on automated data matching, precedent databases, campaign management and rules-based systems. Data matching draws on information gathered from several records, including third-party information. This information is typically used to check the information provided by the taxpayer, while a precedent database informs the formulation of tax rulings. Finally, based on the data they are fed with, these systems may be enabled to decide what actions should be taken, such as sending a communication to the taxpayer about their tax situation (Kariuki, 2014).
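
To make the data-matching step concrete, the following minimal Python sketch reconciles declared amounts against third-party reports. The taxpayer IDs, amounts, field names and tolerance are invented for illustration and are not drawn from any actual administration's system.

```python
# Minimal sketch of automated data matching: income declared by taxpayers is
# compared against amounts reported by third parties (employers, banks, etc.).
# All identifiers and figures below are hypothetical.

# Amounts declared by taxpayers in their returns, keyed by taxpayer ID.
declared = {"TP001": 42_000, "TP002": 55_000, "TP003": 31_000}

# Amounts reported for the same taxpayers by third parties.
third_party = {"TP001": 42_000, "TP002": 61_500, "TP003": 31_200}

TOLERANCE = 500  # small discrepancies are tolerated rather than flagged

for taxpayer_id, declared_amount in declared.items():
    reported_amount = third_party.get(taxpayer_id, 0)
    gap = reported_amount - declared_amount
    if gap > TOLERANCE:
        # A rules-based system would queue this case for review or send a
        # communication to the taxpayer, not issue a final decision.
        print(f"{taxpayer_id}: declared {declared_amount}, "
              f"third parties report {reported_amount} (gap {gap})")
```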

In a recent study, the OECD has highlighted the benefits of ICT used for tax purposes, and considerable attention has been given to the role of Big Data and advanced analytics techniques for tax administrations (OECD, 2016a). Referring to the collection of Big Data from third-party sources, which could then be combined with tax data, the OECD underlines how this will allow revenue bodies to develop tailored e-services that target the specific needs of individual and business taxpayers (OECD, 2016a). Big Data could also improve the ways in which revenue bodies examine and understand taxpayers' activities and behaviour: it can be used for information storage, for analysis across multiple periods, for compliance, control and risk management activities, for identifying and tracking changes in taxpayer abilities and performance so that revenue bodies can respond more effectively and in a timelier manner, and for supporting whole-of-government outcomes by sharing insights and information (Ehrke-Rabel, 2019a; IOTA, 2018).

Regarding advanced analytics techniques, a 2016 OECD survey showed that advanced analytics is principally applied to audit case selection (OECD, 2016a). Moreover, 15 out of the 16 tax administrations that answered the OECD survey indicated that they were deploying advanced analytics to prioritise cases for investigation, audit or other compliance intervention (OECD, 2016a). According to the same OECD study, administrations generally build unsupervised models, which seek to identify interesting or anomalous patterns in the data rather than learning from the outcomes of specific cases. Tax administrations such as the Irish and the Dutch ones have experimented with unsupervised segmentation techniques. These techniques are a sectorial application of broader cluster analysis, through which it is possible to identify groups of taxpayers who are similar to each other in some significant respects and dissimilar to the other groups identified (OECD, 2016a); a sketch of this approach follows below. Ireland has also adopted an alternative approach to segmentation, which focuses on grouping taxpayers based largely on their predicted response to intervention. According to this model, if all taxpayers respond identically to a given intervention, there is little practical value in segmentation, whereas if there are large and consistent differences in response to intervention, segmentation is worthwhile. This approach is based on uplift modelling techniques, which are likely to create multiple segmentations; ultimately, each type of intervention would require a different segmentation of the taxpayer base (OECD, 2016a).
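
As an illustration of what such unsupervised segmentation could look like in practice, here is a minimal Python sketch using k-means clustering on synthetic return features. The features, data and number of clusters are assumptions made for the example, not details of the Irish or Dutch models.

```python
# Minimal sketch of cluster-based taxpayer segmentation (unsupervised):
# the model looks for groups in the data rather than learning from labelled
# audit outcomes. All features and data are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Each row is one (pseudonymised) taxpayer: declared income, deductions
# claimed, and number of amended returns in recent years, all invented.
returns = rng.normal(loc=[40_000, 2_000, 0.5],
                     scale=[15_000, 1_500, 0.7],
                     size=(1_000, 3))

# Standardise so that no single feature dominates the distance measure.
features = StandardScaler().fit_transform(returns)
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)

# Analysts would then characterise each segment (e.g., by average deductions)
# rather than inspect individual taxpayers.
for k in range(4):
    print(f"segment {k}: {np.sum(segments == k)} taxpayers")
```

The choice of four clusters is arbitrary here; in practice the number of segments would itself be an analytic decision.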

Two examples of unsupervised models are the Australian nearest neighbours model, which is able to identify incorrect income tax deductions, and the Irish income-consumption model, which aims at identifying under-declaration of income (OECD, 2016a). The common element in both models, even though they use different statistical techniques (k-nearest neighbours in the case of Australia's model and multiple regression for Ireland's income-consumption model), is the comparison of a taxpayer's return to those of his or her peers. In this way, it is possible to identify outliers for further investigation, and also to identify cases which, even though they may appear unusual on initial inspection, are in fact normal once compared to other, similar cases (OECD, 2016a). Other examples of the implementation of advanced analytics are the Swedish predictive model specifically identifying unreported income, as distinct from over-claiming of deductions, and the US structured income flows model, which links the analysis of related entities to uncover misreporting at the entity level and non-compliance associated with the structure of income flows (OECD, 2016a).
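
The peer-comparison logic common to both models can be sketched as follows, here in the nearest-neighbours variant. The two features, the synthetic data and the flagging threshold are illustrative assumptions, not the actual Australian or Irish models.

```python
# Minimal sketch of peer-comparison outlier detection: a return whose
# distance to its nearest peers is unusually large is flagged for human
# review. Data and threshold are synthetic.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
# Columns: declared income and claimed work-related deductions (invented).
returns = rng.normal(loc=[50_000, 1_500], scale=[10_000, 600], size=(500, 2))
returns[0] = [50_000, 9_000]  # one taxpayer with unusually high deductions

nn = NearestNeighbors(n_neighbors=10).fit(returns)
distances, _ = nn.kneighbors(returns)
score = distances.mean(axis=1)  # mean distance to the 10 nearest peers

# Flag returns far from all peers; a case that merely looks unusual but
# sits close to similar cases will not be flagged.
flagged = np.where(score > score.mean() + 3 * score.std())[0]
print("returns flagged for further investigation:", flagged)
```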

The 2016 OECD survey also shows that tax administrations are using both predictive and prescriptive techniques. The former aim at identifying taxpayers who are more likely to fail to meet their obligations, while the latter are implemented to determine the most effective way to communicate with a certain group of taxpayers. Regarding predictive techniques, tax administrations in countries such as Australia, Canada, Norway and the United Kingdom have implemented programmes for risk modelling and controlled experimentation that identify which cases are likely to fail to meet payment or filing obligations, and which interventions are likely to remedy the problem. In these cases, analytic outputs are used both to prioritise cases and to determine treatment paths. For example, the United Kingdom has built models that are able to assess taxpayer risk prior to filing (e.g., determining which taxpayers are most likely to miss filing deadlines) in order to target interventions to encourage compliance (OECD, 2016a).
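
A minimal sketch of such a predictive filing-risk model is given below, using logistic regression on entirely synthetic features and labels; it illustrates the general technique of scoring and prioritising cases, not any administration's actual model.

```python
# Minimal sketch of a predictive model estimating the risk that a taxpayer
# misses a filing deadline. Features, labels and coefficients are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 2_000
# Invented features: days late on the last return, reminders sent last year,
# and whether the taxpayer uses a tax agent (0/1).
X = np.column_stack([
    rng.poisson(3, n),
    rng.poisson(1, n),
    rng.integers(0, 2, n),
])
# Invented ground truth: 1 = missed this year's filing deadline.
logits = 0.4 * X[:, 0] + 0.8 * X[:, 1] - 1.0 * X[:, 2] - 2.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# The probabilities are used to prioritise reminder interventions,
# not to issue assessments automatically.
risk = model.predict_proba(X_test)[:, 1]
print("five highest-risk cases in the test set:", np.argsort(risk)[-5:])
```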

An example of a prescriptive-analytics technique is so-called experimental design, where treatment and control groups are partitioned and observed in order to isolate the effects of specific actions, interventions or treatments. This instrument is particularly used for direct taxpayer communications: the Norwegian administration, for example, has engaged a behavioural economics researcher to test a variety of communications intended to improve compliance on declarations of foreign income (OECD, 2016a).
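
The treatment/control logic of such an experiment can be sketched in a few lines of Python; the assignment probabilities and response rates below are invented assumptions, not results of the Norwegian trials.

```python
# Minimal sketch of experimental design for taxpayer communications:
# random assignment to two letter variants, then a two-proportion z-test
# on the response rates. All figures are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 1_000
# Random assignment: control gets the standard letter, treatment gets a
# behaviourally informed variant.
treated = rng.random(n) < 0.5
# Invented outcome: did the taxpayer declare foreign income afterwards?
declared = np.where(treated, rng.random(n) < 0.18, rng.random(n) < 0.12)

p_t = declared[treated].mean()
p_c = declared[~treated].mean()

# Pooled two-proportion z-test: is the uplift larger than chance variation?
p_pool = declared.mean()
se = np.sqrt(p_pool * (1 - p_pool) * (1 / treated.sum() + 1 / (~treated).sum()))
z = (p_t - p_c) / se
print(f"control: {p_c:.1%}, treatment: {p_t:.1%}, z = {z:.2f}")
```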

Particularly relevant for the scope of this analysis is the use of technology for tax auditing risk assessment. In this profiling modality, it should not be possible to single out individuals by name or by identifying characteristics. However, it is quite difficult to determine when the collected information and the technological system are effectively singling out taxpayers. This could be the case when a process attaches extra weight to taxpayers with a certain postal code, gender or birth month (Ohm, 2010, as cited by Kroll et al., 2016). The auditing risk assessment is usually conducted by also checking the tax returns that were previously filed (Kroll et al., 2016).

Section 2: How the GDPR notions of profiling and automated decision making fit in the use of ICT tools by tax administrations

In the context of this paper, we focus on two concepts which are relevant to the way tax agencies are using ICT tools and which are both contained in the GDPR, namely profiling and automated decision making. While the academic discourse tends to focus on the commercial applications of these techniques to better segment markets and tailor services and products to individual needs, profiling and automated decision making can be, and are, implemented in the public sector as well (e.g., in education, healthcare and transportation). Indeed, in both the private and the public sector, profiling and automated decision-making can increase the efficiency of delivering a certain service. However, the use of these techniques may raise significant risks for individuals' rights and freedoms.

As we have seen in the previous sections, tax authorities are implementing new technologies for different reasons (e.g., better tax assessment and collection, better communication with taxpayers, increasing tax compliance ex ante). In many of the examples reported, there is a clustering of taxpayers based on the different purposes pursued by the tax administration.

Considering personal income tax, new technologies clustering taxpayers based on the information contained in their tax returns and received from third parties can be a very useful tool for verifying whether the income declared by a natural person is correct. Personal income tax is generally built on different income categories (e.g., business income, employment income, capital income), tax exemptions and the possibility to deduct expenses. This construction makes it possible to levy the tax progressively and in compliance with the ability-to-pay principle.

Traditionally, in order to minimise the interference with taxpayers' personal autonomy, tax collection has been based on the information provided by taxpayers through the submission of their tax returns (Ehrke-Rabel, 2019a). The tax return is the instrument through which natural persons declare the income they have produced during the previous fiscal year.1 Depending on the bracket into which the taxpayer's income falls, taxes will be due at a certain applicable tax rate. Once the tax return is submitted, the tax authority proceeds to the verification and assessment of the taxes due. Because of the high number of tax returns submitted to tax authorities, which essentially constitutes a mass procedure, it has long been assumed that tax authorities are not able to thoroughly verify all returns before assessment. Consequently, initial assessments were (and still are) regularly subject to revision through tax audits (Ehrke-Rabel, 2019a; Vaillancourt et al., 2011; Russell, 2010; Jensen & Wöhlbier, 2012; EU Commission, 2006; OECD, 2006; OECD, 2017).

Moreover, maintaining a progressive system while at the same time avoiding revenue losses has created a complex system for both tax administrations and taxpayers. This has led to the introduction of pre-filled tax returns and the creation of online applications to calculate the amount of taxes due. By matching the submitted tax returns with other information gathered from other public administrations or third parties (e.g., employers, financial institutions, etc.), tax administrations are able to verify whether the declared income is correct. Indeed, a pivotal role in the good functioning of the tax auditing system is played by data transmitted to tax authorities by third parties.2 However, matching these data through ICT tools could lead to profiling of taxpayers and consequently to automated decision making pursuant to the GDPR definitions.

2.1 Profiling performed by tax authorities

As defined by the GDPR, profiling can be described as any form of automated processing of personal data aimed at evaluating certain personal aspects of a natural person. Among these aspects, the European legislator lists the natural person's performance at work, economic situation, health, personal preferences, interests, behaviour, location or movements (Art. 4 (4) GDPR).

It follows from this definition that, in order to verify whether profiling can take place in the tax sphere, three elements need to be present in the way tax administrations use the ICT tools at their disposal and in the way these tools are built:

  1. The processing must be automated;
  2. It must be carried out on personal data of a natural person;
  3. The purpose of the processing is the evaluation of personal aspects of a natural person.

As described above, the increasing number of possible deductions, the different types of income that taxpayers can produce simultaneously, and the sheer number of taxpayers make it impossible for tax administrations to go through each tax return manually. Employing staff to check each tax return would be too expensive for tax administrations (Ehrke-Rabel, 2019c; Lipniewicz, 2017) and would divert resources which could be used for other public activities.

This has led to the adoption of automated systems which are able to go through large amounts of data and verify whether the information submitted by taxpayers is correct. In this sense, the processing of the gathered taxpayers' data is automated and thus fulfils the first of the GDPR requirements for the processing of data to be considered profiling.

Another aspect which needs to be considered is whether the taxpayers' data collected and processed by the tax administrations are personal data. Indeed, the information at the disposal of the tax administrations for verifying the income of a certain taxpayer relates to an identified or identifiable natural person who (as stated in the GDPR definition of personal data) “can be identified, directly or indirectly, by reference to identifiers such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person”.

Finally, for the use of ICT by tax administrations in the management of tax returns and the consequent verification of the correctness of the declared income to qualify as profiling, the purpose of the processing must be the evaluation of personal aspects of a natural person. Among the examples of personal aspects cited in Art. 4 (4), which defines the notion of profiling, is the economic situation of the natural person. This is at the heart of the evaluation of whether the declared income is correct, and in order to verify its correctness all directly and indirectly relevant economic and non-economic elements will be taken into consideration. These elements include financial accounts and expenses such as cars or immovable property (in the latter case, including the exact location and structural features which intrinsically influence price and value), but also medical, cultural or educational expenses.

One last aspect concerning profiling which needs to be considered is the possibility of carrying out group profiling. This type of profiling is based on data from existing groups, but it can also involve categorisation based on aspects shared by group members without them realising that they belong to that particular group (Mantelero, 2016). In the tax sector, risk management tools might divide taxpayers into groups with different risk levels based on different sets of data. It has been noted that this type of profiling produces a significant number of false positives (deciding that a person is a member of the group when they are not) and false negatives (deciding that a person is not a member of the group when they actually are) (Kamarinou, Millard, & Singh, 2017), as the sketch below illustrates. Moreover, the presence of false positives and false negatives can lead to decisions which produce legal or significant effects on individual people. Consequently, Art. 22 GDPR might be applicable, since it requires that the decision based on the profiling address an individual and have legal or significant effects for him/her (Kamarinou et al., 2017).
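
The following minimal Python sketch illustrates why such group profiling inevitably produces false positives and false negatives when an imperfect risk score is thresholded; the score distributions, base rate and threshold are invented assumptions.

```python
# Minimal sketch of false positives/negatives in threshold-based group
# profiling. The "true" compliance status and risk scores are synthetic.
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
truly_noncompliant = rng.random(n) < 0.05  # assumed 5% base rate

# An imperfect score: noncompliant taxpayers score higher on average,
# but the two distributions overlap.
score = rng.normal(loc=np.where(truly_noncompliant, 0.7, 0.4), scale=0.15)

flagged = score > 0.6  # taxpayers placed in the "high risk" group
false_pos = np.sum(flagged & ~truly_noncompliant)  # wrongly put in the group
false_neg = np.sum(~flagged & truly_noncompliant)  # wrongly left out of it
print(f"false positives: {false_pos}, false negatives: {false_neg}")
```

Raising the threshold trades false positives for false negatives; neither can be eliminated entirely, which is why the legal or significant effects of the resulting decisions matter under Art. 22 GDPR.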

2.2 Automated decision making in tax matters

As will be further investigated in this section, profiling might also lead to decisions based on the processed data which are automated; consequently, both individual profiling and group profiling might trigger the application of Art. 22 GDPR. With regard to automated decision making under the GDPR, there are two aspects which need to be further analysed, especially in connection with their implications in tax matters. First, it is important to understand the scope of the word “decision”. Second, it is important to identify the cases where the decision is “solely” automated.

In tax matters, the use of software able to go through the data collected by tax authorities in relation to tax returns and information provided by third parties will lead to the identification of possible mismatches between what has been declared by the taxpayer and what results from the combination of all the information available to the tax authorities. Consequently, a tax assessment notice indicating a different amount of tax to be paid, together with the relevant sanctions (where more tax is due than has been paid by the taxpayer), will be sent to the taxpayer. Depending on the procedural rules of each member state, the taxpayer will be given a certain amount of time to challenge the tax assessment notice. This means that the tax assessment notice, which is based on the results of the software matching the different information available to the tax authorities, is neither a final decision nor a court decision.

The meaning of the word “decision” in the context of automated decision making can be derived by looking at the different parts of the GDPR text. It has already been highlighted that Art. 22 GDPR does not specify whether the decision mentioned in the article has to be a final decision or a mere interim or individual step taken during the automated processing. However, recital 71 of the GDPR expressly states that the word “decision” should also include “measure”. Thus, the word “decision” is to be understood in a broader sense. At the same time, Art. 22 of the GDPR describes the “decision” as one which produces legal effects or similarly significantly affects the data subject. On the one hand, the “legal” element requires that the decision be binding or that it create legal obligations for the data subject. In the case of the tax assessment notice, if the taxpayer neither challenges nor complies with it, it can be enforced by the relevant authorities. On the other hand, the fact that the GDPR adds the word “similarly”, absent in the previous directive, to the phrase “significantly affects” means that the threshold for significance must be similar to that of a decision producing a legal effect. Even if it can be argued that the “significant” element is rather vague, the Article 29 Working Party has identified possible categories of decisions which can be considered as producing “similarly significant” effects on data subjects (Veale & Edwards, 2018). These categories include decisions affecting someone's access to health services or to education, decisions denying someone an employment opportunity or putting them at a serious disadvantage, and decisions affecting someone's financial circumstances. Undoubtedly, tax assessment notices affect the financial circumstances of the data subject (Art. 29 Working Party, 2017).

The second aspect that needs to be considered in order to identify a solely automated decision is the level of human intervention. Art. 22 of the GDPR applies only where decisions are made in a “solely” automated way, and the scope of the word “solely” is decisive for the practical extent of the rights granted to data subjects (Bygrave, 2001; Wachter et al., 2017; Veale & Edwards, 2018). In order to frame the scope of the notion of “solely”, attention needs to be focused on the level of human intervention in the loop. Indeed, it is difficult to find completely automated systems where decisions are made “solely” by the algorithm (Veale & Edwards, 2018). Consequently, a literal interpretation of the word “solely” would significantly reduce the practical scope of application of Article 22, and it might even encourage the wider introduction of nominal human intervention in the loop, consisting of mere “rubber-stamping”, in order to limit the application of Article 22 (Veale & Edwards, 2018). According to the Article 29 Working Party (2017), the activity leading to the decision should not be a token gesture; there must be an influential activity exercised by a human. The main issue in the context of this contribution is whether the mere signature by the tax agent responsible for the assessment procedure, reported on an assessment notice that is completely based on the ICT system used and is to be sent to the taxpayer, can be considered a sufficient indication of human intervention. Depending on a case-by-case analysis, it might be that the tax agent had to conduct further investigations before finalising and sending the assessment notice. Nevertheless, the outcome resulting from the implementation of an ICT system, on which the assessment letter is based, will hardly be questioned by the tax agent. In fact, there are studies showing that even in systems whose explicit intention is merely to support a human decision-maker, the perceived trustworthiness of the system's automated logic, lack of time and reasons of convenience tend to make the system operate as wholly automated (Skitka, 2000, as cited by Veale & Edwards, 2018). The difficulties in interpreting the level of human intervention emerge in particular from national experiences. For example, the German Federal Court has adopted a restrictive interpretation and considered any minimal human intervention as excluding the applicability of the old Art. 15 of the Data Protection Directive.3 By contrast, according to the interpretation of the UK data protection authority (ICO), Art. 22 should be applicable even where the human intervention involved is irrelevant to the outcome (Information Commissioner’s Office, 2017). From a scholarly perspective, there are different opinions. Some scholars have opted for the interpretation precluding the application of Art. 22 to any decision-making process where even a minimal intervention is involved (Martini, 2017, as cited by Malgieri & Comandé, 2017). By contrast, Malgieri and Comandé (2017) argue that limiting the application to these cases can be compared to “a rubber-stamping on the automated processing, easily performed even by a monkey or another trained animal”.
Similarly, Veale and Edwards (2018), on the basis of the above-cited studies on the blind trust placed in automated logic by human decision-makers (Skitka, 2000), claim that there is a strong argument that the scope of Article 22 should also include decisions where there is some degree of human involvement, though the extent of this degree is hard to determine. This interpretation, endorsed by the UK ICO, holds that the word “solely” in the context of Art. 22 (1) is intended to cover those automated decision-making processes in which humans exercise no real influence on the outcome of the decision, for example where the result of the profiling or process is not assessed by a person before being formalised as a decision (Information Commissioner’s Office, 2017). Thus, minimal human intervention with no real influence on the outcome of the decision cannot be sufficient to exclude the applicability of Art. 22 (1) (Malgieri & Comandé, 2017), and this might be the case where the tax agent merely signs the tax assessment notice to be sent to the taxpayer.

Finally, regarding the legal or significant effects, it is beyond doubt that the decision to proceed to the assessment, or to require taxpayers to pay a higher amount of taxes than what they had declared (or, rather, not declared), will significantly affect the taxpayers' sphere. Consequently, taxpayers must be granted the right to appeal that decision or, more generally, have access to a judicial remedy. Accepting that the requirements of Art. 22 (1) are met is fundamental, because it means that profiling and automated decision making will still be allowed in tax matters if, according to the second paragraph, these activities are authorised by European Union or member state law to which the controller is subject. Moreover, such provisions must lay down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests. Thus, laws providing for ICT systems to run activities such as profiling and automated decision-making must lay down suitable safeguards. These safeguards, however, are not described in the text of the Regulation but only in the recitals.

Section 3: The need to balance individual privacy rights with the public interest embodied in tax compliance

According to Art. 22 GDPR, profiling and automated decision-making are in general strongly limited. However, the need to balance taxpayers' privacy rights with the public interest embodied in tax compliance required the EU legislator to consider that, in many member states, tax administrations' activities carried out through ICT tools could consist, as emerges from the previous section, in forms of profiling which might also lead to automated decision making according to the GDPR definitions. Moreover, as reported by international organisations such as the OECD, these instruments represent an efficient lever to prevent and fight tax evasion and, consequently, revenue losses.

For this reason, the GDPR provisions concerning data processing, profiling and automated decision making contain important exceptions to the general rules governing these procedures. Nevertheless, these exceptions must be introduced by legislation and respect the essence of fundamental rights and freedoms (Ehrke-Rabel, 2019b). Indeed, the aim is to safeguard the public interest, in which the protection of public revenues from tax evasion is, and must be, included.

3.1 Striking a balance in data processing

Starting with data processing, Art. 6 of the GDPR defines the cases in which processing is considered lawful. Relevant for the tax law sphere is letter e), which states that the processing of data is lawful if necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. Moreover, the legitimate-interest basis contained in letter f) of Art. 6 (1) GDPR does not apply to processing carried out by public authorities in the performance of their tasks, which is the case of tax authorities. However, for the processing to be lawful in the case of the performance of a task carried out in the public interest or in the exercise of official authority, such as that carried out by tax authorities, Art. 6 (3) establishes the need for a legal basis laid down by: (a) Union law; or (b) member state law to which the controller is subject, which shall be proportionate to the legitimate public interest aim pursued. The same Art. 6 also contains a series of specific provisions which need to be included in the legal basis for processing according to Art. 6 (1) lit. e) and which consequently apply to processing for tax matters as well. Examples of these specific provisions concern the type of processed data, the identification of the data subjects, the purpose limitation, the storage period and the general conditions governing the lawfulness of processing by the controller. Nevertheless, member states can provide for more specific requirements for the processing and other measures to ensure lawful and fair processing. Thus, it might be that a tax law allowing taxpayers' data processing in one member state offers additional protection to taxpayers' privacy when compared to that of other member states.

Moreover, regarding the processing of special categories of personal data, the relevant provision in the GDPR is Article 9. Special categories of data include data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, as well as genetic data, biometric data processed for the purpose of uniquely identifying a natural person, data concerning health and data concerning a natural person's sex life or sexual orientation. The general rule established in Art. 9 (1) prohibits the processing of these data. However, paragraph 2 states exceptions to the application of the first paragraph. Similarly to Art. 6, these exceptions include the case where processing is necessary for reasons of substantial public interest, on the basis of European Union or member state law. Indeed, one reason of substantial public interest is tax compliance and the state's need to safeguard its resources from tax evasion. However, the exception enshrined in Art. 9 (2) (g), which is relevant also in the field of taxation, is limited by a proportionality test4 which has to take place with reference to the aim pursued. The processing also has to respect the essence of the right to data protection, and the law allowing the processing must provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject.

From combining these two articles on processing, it follows that for tax reasons, which are part of the broader “public interest”, member states may process data, including data belonging to the special categories. Nevertheless, this permission for reasons of public interest must still undergo a proportionality test, and it must provide for safeguards of the fundamental rights and interests of the data subject. However, the safeguards that need to be adopted are not listed or exemplified; therefore, it remains quite vague which measures member states will adopt. Due to the territoriality and worldwide taxation principles, information gathered for tax purposes might well include racial or ethnic origins, or information on health expenses claimed for tax exemptions. It might even include information on religious belief, such as where states levy so-called “church taxes”5 or where tax deductions are granted for donations to religious or charitable organisations.6 Moreover, in most tax systems, these pieces of information will be directly provided by the taxpayer or by third parties, depending on the type of information.

3.2 Striking a balance in profiling and automated decision making

Regarding profiling, the relevant provision is Art. 22 which, as previously described, establishes the right of the data subject not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning the data subject or similarly significantly affects him or her. However, this provision also provides for limitations to this data subject's right.

According to Recital 73, the right not to be subjected to automated decision making and profiling, together with the “rights of information, access to and rectification or erasure of personal data, the right to data portability, the right to object, decisions based on profiling, as well as the communication of a personal data breach to a data subject and certain related obligations of the controllers”, can be restricted by European Union or member state law in the taxation field. Art. 23 (1) (e) expressly mentions taxation matters among the important objectives of general public interest of the Union. However, Art. 23 (1) establishes that any legislative measure restricting those rights (provided for in Artt. 12 to 22 and Art. 34, as well as Art. 5 in so far as its provisions correspond to the rights and obligations provided for in Artt. 12 to 22) must respect the essence of fundamental rights and freedoms and must be a necessary and proportionate measure. Additionally, in order to ensure respect for fundamental rights and freedoms, which also include the right to privacy, Art. 23 (2) lists the information which needs to be included in the legislative measure allowing such restrictions:

  1. the purposes of the processing or categories of processing;
  2. the categories of personal data;
  3. the scope of the restrictions introduced;
  4. the safeguards to prevent abuse or unlawful access or transfer;
  5. the specification of the controller or categories of controllers;
  6. the storage periods and the applicable safeguards taking into account the nature, scope and purposes of the processing or categories of processing;
  7. the risks to the rights and freedoms of data subjects; and
  8. the right of data subjects to be informed about the restriction, unless that may be prejudicial to the purpose of the restriction.

Nevertheless, member states may supplement this list with other information at their discretion.

On a different note, it might be argued that the information contained in the list of Art. 23 (2) could reveal the red flags on the basis of which tax authorities assess taxpayers, and thereby deprive the authorities of an important instrument for detecting possible tax evasion or tax avoidance schemes. Indeed, by knowing exactly how the information is treated and how the technology works, taxpayers could fill in their tax returns, or more generally adopt behaviours, designed to defeat the predictive measures adopted by revenue agencies to fight tax evasion and avoidance (Reeves, 2015). As already highlighted by Kroll et al. (2016), keeping the decision policy secret is useful in preventing strategic gaming of the system. Thus, limiting meaningful information about the logic involved in the ICT tool used by the tax administration should be considered legitimate (Ehrke-Rabel, 2019a; Ehrke-Rabel, 2019b). Nonetheless, in my opinion, the information required by Art. 23 (2) is not able to offer a concrete overview of how the system works and therefore should not be considered as endangering the public tasks carried out with these instruments.

Moreover, a second reference to the possible use of profiling and automated decision making can be found in Recital 71. Recital 71, even if, unlike the text of the Regulation, it is not legally binding,7 expressly mentions fraud and tax-evasion monitoring as fields where these activities can be authorised by member state law. However, despite the non-binding nature of recitals, they can be relevant as supplementary interpretative tools for the identification of the safeguards which need to be included in the legal basis for profiling and automated decision making, as stated in Art. 22. In fact, on the content of those safeguards, Recital 71 establishes that “In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision”, and that, in order to ensure fair and transparent processing in respect of the data subject, the controller should use appropriate mathematical or statistical procedures for the profiling and implement technical and organisational measures appropriate to ensure that inaccuracies in personal data are corrected and that the risk of errors is minimised. Moreover, personal data shall be secured in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation, or measures having such an effect. Furthermore, the recital states that automated decision-making and profiling based on special categories of personal data should be allowed only under specific conditions.

Section 4: Policy implications for EU member states

As emerges from the GDPR provisions, the European legislator has clearly recognised that technologies allowing the processing of large amounts of data and profiling (which might also lead to automated decisions) can represent a fundamental tool for tax administrations in the fight against tax evasion and fraud. At the same time, the European legislator has attempted to strike a balance between the public interest in protecting public revenue and taxpayers' data protection rights, by requiring the presence of safeguards in the legislation allowing the use of such technologies.

From combining the two provisions on data processing (Art. 6 and Art. 9 GDPR), it follows that for tax reasons, which are part of the broader “public interest”, member states may process data, including data belonging to the special categories. Nevertheless, this permission for reasons of public interest must still undergo a proportionality test, and it must provide for safeguards of the fundamental rights and interests of the data subject. Similarly, in the context of automated individual decision-making, including profiling, restrictions to the rights of the data subject must respect the essence of fundamental rights and freedoms and must be a necessary and proportionate measure in a democratic society (Art. 23).

Firstly, from the member states' perspective, this means that they must verify whether the use of ICT tools for carrying out tax administration activities involves any form of data processing, profiling or automated decision making. If so, a specific legal basis must be in place. Indeed, the entry into force of the GDPR has created the need for a specific legal basis for ICT instruments such as those used by tax administrations, through which data are processed, profiles are created and automated decisions are taken. Secondly, whether the use of these tools already has a legal basis or member states need to adopt a new piece of legislation allowing their use by tax administrations, these provisions must include the safeguards prescribed by the GDPR.

Nevertheless, because these safeguards tend to be very vague, the GDPR leaves a lot of discretion to member states regarding the level of protection of taxpayers' privacy. Indeed, the GDPR provides only a minimum level of protection to be included in member states' legislation allowing the use of ICT tools for profiling and automated decision making in tax matters. Member states can thus increase the level of protection at their discretion. However, different margins in extending the scope of the safeguards might lead to misalignments in the way taxpayers' privacy is protected across EU member states. Moreover, the lack of both a common auditing system in the European Union and a common instrument ensuring taxpayers' rights, such as a European Taxpayer Code (EU Commission, 2016) or Charter (CFE, 2018), further intensifies the possible discrepancies in the level of protection of taxpayers' data and privacy among member states.

Conclusions

In recent years, the use of ICTs by tax authorities has considerably improved their ability to carry out their tasks (e.g., tax monitoring, taxpayers' auditing, tax collection) in the public interest. For this reason, investment in ICT for revenue agencies has been highlighted as a priority by many international institutions (OECD, 2016a; OECD, 2016b; Cotton & Dark, 2017). New technologies have simplified the ways in which tax administrations can assess taxpayers and identify tax evaders. However, if on the one hand tax authorities need to be provided with the most efficient instruments to prevent and fight tax evasion and tax avoidance, on the other hand this need must be balanced with the privacy rights of taxpayers.

More specifically, ICT tools (including, in particular, risk management systems) are able to combine data provided by third parties and by taxpayers, process them in order to categorise taxpayers on the basis of their compliance risks and finally, based on their profiles, identify the taxpayers who will be subjected to audits. The way in which these systems operate matches the definitions of data processing, profiling and automated decision making contained in the GDPR. However, from analysing the text of the GDPR, it emerges that tax authorities, because of the public interests they fulfil, are enabled to use ICT instruments which might facilitate, also through profiling and data matching, the carrying out of their tasks. First of all, this means that member states will have to adopt (where not already in place) a legal basis allowing tax authorities to use ICT tools performing profiling and automated decision making. Secondly, according to Recital 71 of the GDPR, the legislative measures authorising decision-making based on profiling for fraud and tax-evasion monitoring shall provide the data subject with the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision (De Raedt, 2018). However, the text of the regulation itself does not expressly indicate or describe the safeguards mentioned in Art. 22. By contrast, Art. 23 (2) provides, with regard to automated decision making, a list of information which must be contained in the legislative measure permitting the use of automated decision making by tax authorities. Nevertheless, the presence of these requirements in the law and in the ICT systems effectively used by tax administrations needs to be assessed on a case-by-case basis at the national level. Indeed, the GDPR, by requiring the inclusion of these safeguards, only offers a minimal level of protection that might be extended at the national level. Moreover, the vagueness of these safeguards as indicated in the GDPR text, and the discretion left to member states in implementing them, may lead to an even wider gap between the levels of taxpayer protection across member states.

Acknowledgements

I thank the reviewers and editors for their insightful comments and Professor Tina Ehrke-Rabel for the valuable and inspiring discussions.

References

Article 29 Working Party. (2017). Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (as last revised and adopted on 6 February 2018). Retrieved from https://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=49826

Bygrave, L. A. (2001). Minding the Machine: Article 15 of the EC Data Protection Directive and Automated Profiling. Computer Law & Security Report, 17(1), 17–24. doi:10.1016/S0267-3649(01)00104-2

CFE. (2018). Opinion Statement CFE 1/2018 on the Importance of Taxpayer Rights, Codes and Charters on Tax Good Governance. Retrieved from https://taxadviserseurope.org/wp-content/uploads/2018/06/CFE-Opinion-Statement-on-the-Importance-of-Taxpayer-Rights-Codes-and-Charters-on-Tax-Good-Governance-1.pdf

Chatama, Y. J. (2013). The impact of ICT on Taxation: the case of Large Taxpayer Department of Tanzania Revenue Authority. Developing Country Studies, 3(2), 91–100. Retrieved from https://iiste.org/Journals/index.php/DCS/article/view/4258

Cotton, M., & Dark, G. (2017, March). Use of Technology in Tax Administrations 1: Developing an Information Technology Strategic Plan [Technical Note]. Washington, DC: International Monetary Fund. doi:10.5089/9781475583601.005

Coudert, F. (2010). When video cameras watch and screen: Privacy implications of pattern recognition technologies. Computer Law and Security Review, 26(4), 377–384. doi:10.1016/j.clsr.2010.03.007

De Raedt, S. (2018). The Impact of the GDPR for the Belgian Tax Authorities. Revue du Droit des Technologies de l’Information, 66-67, 129–143.

Doran, M. (2009). Tax Penalties and Tax Compliance. Harvard Journal on Legislation, 46(1), 111–161.

Edwards-Dowe, D. (2008). E-Filing and E-Payments – The Way Forward. Presented at the Caribbean Organization of Tax Administration (COTA) General Assembly, Belize City, Belize. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.527.5799&rep=rep1&type=pdf

Veale, M., & Edwards, L. (2018). Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling. Computer Law & Security Review, 34(2), 398–404. doi:10.1016/j.clsr.2017.12.002

Ehrke-Rabel, T. (2019a). Big data in tax collection and enforcement. In W. Haslehner, G. Kofler, K. Pantazatou, & A. Rust (Eds.), Tax and the Digital Economy: Challenges and Proposals for Reform. Alphen aan den Rijn: Kluwer Law International.

Ehrke-Rabel, T. (2019b). Profiling im Steuervollzug. FinanzRundschau, 101(2), 45–58.

Ehrke-Rabel, T. (2019c). Third Parties as Supplementary Sources of Tax Transparency. In F. Busran & J. Hey (Eds.), Tax Transparency. Amsterdam: IBFD.

EU Commission, Directorate General Taxation and Customs Union, Fiscalis Risk Analysis Project Group. (2006). Risk Management Guide for Tax Administrations. Retrieved from https://ec.europa.eu/taxation_customs/sites/taxation/files/resources/documents/taxation/tax_cooperation/gen_overview/risk_management_guide_for_tax_administrations_en.pdf

EU Commission (2016). Guidelines for a Model for a European Taxpayers’ Code. Retrieved from https://ec.europa.eu/taxation_customs/business/tax-cooperation-control/guidelines-model-european-taxpayers-code_en

Gutwirth, S., & Hildebrandt, M. (2010). Some caveats on profiling. In S. Gutwirth, Y. Poullet, & P. De Hert, (Eds.), Data protection in a profiled world. Dordrecht: Springer. doi:10.1007/978-90-481-8865-9_2

Hatfield, M. (2015). Taxation and surveillance: an agenda. Yale Journal of Law & Technology 17, 319–367. Retrieved from https://yjolt.org/taxation-and-surveillance-agenda

Information Commissioner’s Office (ICO). (2017). Feedback request – profiling and automated decision-making. Retrieved from https://ico.org.uk/about-the-ico/ico-and-stakeholder-consultations/feedback-request-profiling-and-automated-decision-making/

Internal Revenue Service (IRS). (2012). Annual Report to Congress. National Taxpayer Advocate. Retrieved from https://taxpayeradvocate.irs.gov/2012-Annual-Report/FY-2012-Annual-Report-To-Congress-Full-Report.html

Intra-European Organisation of Tax Administrations (IOTA). (2018). Impact of Digitalisation on the Transformation of Tax Administrations. Retrieved from https://www.iota-tax.org/publication/impact-digitalisation-transformation-tax-administrations-0

Jensen, J., & Wöhlbier, F. (2012). Improving tax governance in EU Member States: Criteria for successful policies [European Commission Occasional Paper No. 114]. Retrieved from https://ec.europa.eu/economy_finance/publications/occasional_paper/2012/pdf/ocp114_en.pdf

Kamarinou, D., Millard, C., & Singh, J. (2017). Machine Learning with Personal Data. In R. Leenes, R. van Brakel, S. Gutwirth, & P. De Hert, (Eds.), Data Protection and Privacy: The Age of Intelligent Machines. Oxford: Hart Publishing.

Kariuki, E. (2014). Automation in Tax Administration: Towards sustainable ICT systems in tax administrations [APRIL Publication No. 4]. Nairobi: African Policy Research Institute Limited. Retrieved from http://www.april-ssa.com/assets/april--automation-in-tax-administrations.pdf

Kroll, J. A., Huey J., Barocas, S., Felten, E. W., Reidenberg, J. R., Robinson, D. G. & Yu, H. (2016). Accountable Algorithms. University of Pennsylvania Law Review, 165(3), 633–705. Retrieved from https://scholarship.law.upenn.edu/penn_law_review/vol165/iss3/3

Lederman, L. (2007). Statutory Speed Bumps: The Roles Third Parties Play in Tax Compliance. Stanford Law Review, 60(3), 695–743. Retrieved from https://www.stanfordlawreview.org/print/article/statutory-speed-bumps-the-roles-third-parties-play-in-tax-compliance/

Lipniewicz, R. (2017). Tax Administration and Risk Management in the Digital Age. Information Systems in Management, 6(1), 26–37. Retrieved from http://yadda.icm.edu.pl/yadda/element/bwmeta1.element.ekon-element-000171468955

Malgieri, G., & Comandé, G. (2017). Why a Right to Legibility of Automated Decision-Making Exists in the General Data Protection Regulation. International Data Privacy Law, 7(4), 243–265. doi:10.1093/idpl/ipx019

Mantelero, A. (2016). Personal data for decisional purposes in the age of analytics: From an individual to a collective dimension of data protection. Computer Law & Security Review, 32(2), 238–255. doi:10.1016/j.clsr.2016.01.014

Martini, M. (2017). DS-GVO Art. 22 Automatisierte Entscheidungen im Einzelfall einschließlich Profiling [GDPR Art. 22 Automated individual decision-making, including profiling]. In B. Paal & D. Pauly (Eds.), Datenschutz-Grundverordnung (pp. 260–264). Munich: C.H. Beck.

OECD. (2006). Using Third Party Information Reports to Assist Taxpayers Meet their Return Filing Obligations—Country Experiences With the Use of Pre-populated Personal Tax Returns [Information Note]. Organisation for Economic Co-operation and Development. Retrieved from https://www.oecd.org/tax/administration/36280368.pdf

OECD. (2015). Addressing the Tax Challenges of the Digital Economy, Action 1 – 2015 Final Report [OECD/G20 Base Erosion and Profit Shifting Project]. Organisation for Economic Co-operation and Development. Retrieved from https://www.oecd.org/ctp/addressing-the-tax-challenges-of-the-digital-economy-action-1-2015-final-report-9789264241046-en.htm

OECD. (2016a). Advanced Analytics for Better Tax Administration. Organisation for Economic Co-operation and Development. Retrieved from https://www.oecd.org/publications/advanced-analytics-for-better-tax-administration-9789264256453-en.htm

OECD. (2016b). Technologies for Better Tax Administration: A Practical Guide for Revenue Bodies. Organisation for Economic Co-operation and Development. Retrieved from https://www.oecd.org/publications/technologies-for-better-tax-administration-9789264256439-en.htm

OECD. (2017). The Changing Tax Compliance Environment and the Role of Tax Audit. Organisation for Economic Co-operation and Development. Retrieved from https://www.oecd.org/ctp/the-changing-tax-compliance-environment-and-the-role-of-audit-9789264282186-en.htm

OECD. (2019). Unlocking the Digital Economy – A Guide to Implementing Application Programming Interfaces in Government. Organisation for Economic Co-operation and Development. Retrieved from http://www.oecd.org/ctp/unlocking-the-digital-economy-guide-to-implementing-application-programming-interfaces-in-government.htm

Ohm, P. (2010). Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization. UCLA Law Review, 57(6), 1701–1777. Retrieved from https://www.uclalawreview.org/broken-promises-of-privacy-responding-to-the-surprising-failure-of-anonymization-2/

Reeves, J. (2014, March 15). IRS Red Flags: How to Avoid a Tax Audit. USA Today. Retrieved from http://www.usatoday.com/story/money/personalfinance/2014/03/15/irs-tax-audit/5864023

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). (2016, May 4). Retrieved from https://eur-lex.europa.eu/eli/reg/2016/679/oj

Russell, B. (2010). Revenue Administration: Developing a Taxpayer Compliance Program [Technical Notes]. Washington, DC: International Monetary Fund. Retrieved from https://www.imf.org/external/pubs/ft/tnm/2010/tnm1017.pdf

Skitka, L. J., Mosier, K., & Burdick, M. D. (2000). Accountability and automation bias. International Journal of Human-Computer Studies, 52(4), 701–717. doi:10.1006/ijhc.1999.0349

Smith, W. H. (1969). Automation in Tax Administration. Law and Contemporary Problems, 34(4), 751–768. doi:10.2307/1190909

Tomar, L., Guicheney, W., Kyarisiima, H., & Zimani, T. (2016). Big Data in the Public Sector: Selected Applications and Lessons Learned [Discussion Paper No. IDB-DP-483]. Washington, DC: Inter-American Development Bank. Retrieved from https://publications.iadb.org/en/big-data-public-sector-selected-applications-and-lessons-learned

Vaillancourt, F., Evans, C., Tran-Nam, B., Verdonck, M., Erard, B., & Duran-Cabre, J. (2011). Prefilled Personal Income Tax Returns: A Comparative Analysis of Australia, Belgium, California, Quebec and Spain. Fraser Institute. Retrieved from https://www.fraserinstitute.org/sites/default/files/prefilled-personal-income-tax-returns.pdf

Wachter, S., Mittelstadt, B., & Floridi, L. (2017). Why a right to explanation of automated decision-making does not exist in the General Data Protection Regulation. International Data Privacy Law, 7(2), 76–99. doi:10.1093/idpl/ipx005

World Bank. (2016). World Development Report 2016: Digital Dividends Overview. Retrieved from https://www.worldbank.org/en/publication/wdr2016

Footnotes

1. According to the OECD, Tax Administration 2017: Comparative Information on OECD and other Advanced and Emerging Economies (2017), p. 191, most tax administration systems still require the taxpayer to fill in his or her own tax return.

2. Previous studies show that 97% of taxpayers’ information is provided to the IRS in routine reports from third parties (IRS, 2012, as cited by Hatfield, 2015).

3. BGH (German Federal Court) 28. 1. 2014 - VI ZR 156/13 (LG Gießen, AG Gießen), p. 169.

4. As Advocate General Saugmandsgaard Øe explained in his opinion in ECJ joined cases C-203/15 and C-698/15, Tele2 Sverige AB, 21 December 2016, ECLI:EU:C:2016:572, para. 247, the “requirement of proportionality within a democratic society – or proportionality stricto sensu – flows both from Article 15(1) of Directive 2002/58 and Article 52(1) of the Charter, as well as from settled case-law: it has been consistently held that a measure which interferes with fundamental rights may be regarded as proportionate only if the disadvantages caused are not disproportionate to the aims pursued”. Other relevant case law on the proportionality test in the context of data protection includes: ECJ Case C-275/06, Productores de Música de España (Promusicae) v Telefónica de España SAU, 29 January 2008, ECLI:EU:C:2008:54, para. 68; ECJ joined cases C-293/12 and C-594/12, Digital Rights Ireland, 8 April 2014, ECLI:EU:C:2014:238; ECJ joined cases C-203/15 and C-698/15, Tele2 Sverige AB, 21 December 2016, ECLI:EU:C:2016:572; ECJ Case C-83/14, CHEZ Razpredelenie Bulgaria AD, 16 July 2015, ECLI:EU:C:2015:480; ECJ Case C-362/14, Schrems, 6 October 2015, ECLI:EU:C:2015:650 (EDPS, 2019).

5. Mandatory church taxes are levied in Austria, Germany, Finland, Denmark and Sweden (PEW, 2019).

6. According to previous studies, tax deduction schemes are in place in 9 of the 14 European countries (not limited to EU member states; Switzerland is also included) that offer tax incentives for individual donations. These states include Austria, the Czech Republic, Germany, Italy (which also offers tax credits), the Netherlands and Switzerland. Tax deductions can also be facilitated through percentage allocation schemes, which are in use in Slovakia and Slovenia. In this case, a fixed percentage of income tax can be donated directly to charity from a tax return or statement. Meanwhile, donors in Belgium, France, Italy, Norway and Spain can claim a tax credit against the value of their donations (EFA, 2018). On the possibility of tax deductions for donations to charitable organisations within the EU, please see Case C-318/07, Hein Persche v Finanzamt Lüdenscheid, 27 January 2009, ECLI:EU:C:2009:33.

7. Recitals are not legally binding. However, they might perform a supplementary normative role, which the European Commission has confirmed. Even though the European Court of Justice has explained that recitals do not have autonomous legal effect and “cannot be relied upon to interpret [a provision] in a manner clearly contrary to its wording”, this does not undermine their supplementary interpretative nature (Malgieri & Comandé, 2017). ECJ Case C-308/97, Manfredi, 25 November 1998, ECLI:EU:C:1998:566, para. 30. See also Case C-136/04, Deutsche Milch-Kontor, 24 November 2005, ECLI:EU:C:2005:716, para. 32; Case C-134/08, Tyson Parketthandel, 2 April 2009, ECLI:EU:C:2009:229, para. 16; Case C-7/11, Caronna, 28 June 2012, ECLI:EU:C:2012:396, para. 40.
