Citation and metadata
Recommended citation
Guido Noto La Diega, Against the Dehumanisation of Decision-Making – Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information, 9 (2018) JIPITEC 3 para 1.
Download Citation
Endnote
%0 Journal Article %T Against the Dehumanisation of Decision-Making – Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information %A Noto La Diega, Guido %J JIPITEC %D 2018 %V 9 %N 1 %@ 2190-3387 %F notoladiega2018 %X This work presents ten arguments against algorithmic decision-making. These revolve around the concepts of ubiquitous discretionary interpretation, holistic intuition, algorithmic bias, the three black boxes, psychology of conformity, power of sanctions, civilising force of hypocrisy, pluralism, empathy, and technocracy. Nowadays algorithms can decide if one can get a loan, is allowed to cross a border, or must go to prison. Artificial intelligence techniques (natural language processing and machine learning in the first place) enable private and public decision-makers to analyse big data in order to build profiles, which are used to make decisions in an automated way. The lack of transparency of the algorithmic decision-making process does not stem merely from the characteristics of the relevant techniques used, which can make it impossible to access the rationale of the decision. It depends also on the abuse of and overlap between intellectual property rights (the “legal black box”). In the US, nearly half a million patented inventions concern algorithms; more than 67% of the algorithm-related patents were issued over the last ten years and the trend is increasing. To counter the increased monopolisation of algorithms by means of intellectual property rights (with trade secrets leading the way), this paper presents three legal routes that enable citizens to ‘open’ the algorithms. First, copyright and patent exceptions, as well as trade secrets, are discussed. Second, the EU General Data Protection Regulation is critically assessed. In principle, data controllers are not allowed to use algorithms to take decisions that have legal effects on the data subject’s life or similarly significantly affect them. However, when they are allowed to do so, the data subject still has the right to obtain human intervention, to express their point of view, as well as to contest the decision. Additionally, the data controller shall provide meaningful information about the logic involved in the algorithmic decision. Third, this paper critically analyses the first known case of a court using the access right under the freedom of information regime to grant an injunction to release the source code of the computer program that implements an algorithm. Only an integrated approach – which takes into account intellectual property, data protection, and freedom of information – may provide the citizen affected by an algorithmic decision with an effective remedy, as required by the Charter of Fundamental Rights of the EU and the European Convention on Human Rights. %L 340 %K Algorithmic decision-making %K Data Protection Act 2018 %K GDPR %K algorithmic accountability %K algorithmic bias %K algorithmic governance %K algorithmic transparency %K freedom of information request %K patent infringement defences %K right not to be subject to an algorithmic decision %K software copyright exceptions %U http://nbn-resolving.de/urn:nbn:de:0009-29-46778 %P 3-34
BibTeX
@Article{notoladiega2018, author = "Noto La Diega, Guido", title = "Against the Dehumanisation of Decision-Making -- Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information", journal = "JIPITEC", year = "2018", volume = "9", number = "1", pages = "3--34", keywords = "Algorithmic decision-making; Data Protection Act 2018; GDPR; algorithmic accountability; algorithmic bias; algorithmic governance; algorithmic transparency; freedom of information request; patent infringement defences; right not to be subject to an algorithmic decision; software copyright exceptions", abstract = "This work presents ten arguments against algorithmic decision-making. These revolve around the concepts of ubiquitous discretionary interpretation, holistic intuition, algorithmic bias, the three black boxes, psychology of conformity, power of sanctions, civilising force of hypocrisy, pluralism, empathy, and technocracy. Nowadays algorithms can decide if one can get a loan, is allowed to cross a border, or must go to prison. Artificial intelligence techniques (natural language processing and machine learning in the first place) enable private and public decision-makers to analyse big data in order to build profiles, which are used to make decisions in an automated way. The lack of transparency of the algorithmic decision-making process does not stem merely from the characteristics of the relevant techniques used, which can make it impossible to access the rationale of the decision. It depends also on the abuse of and overlap between intellectual property rights (the ``legal black box''). In the US, nearly half a million patented inventions concern algorithms; more than 67{\%} of the algorithm-related patents were issued over the last ten years and the trend is increasing. To counter the increased monopolisation of algorithms by means of intellectual property rights (with trade secrets leading the way), this paper presents three legal routes that enable citizens to `open' the algorithms. First, copyright and patent exceptions, as well as trade secrets, are discussed. Second, the EU General Data Protection Regulation is critically assessed. In principle, data controllers are not allowed to use algorithms to take decisions that have legal effects on the data subject's life or similarly significantly affect them. However, when they are allowed to do so, the data subject still has the right to obtain human intervention, to express their point of view, as well as to contest the decision. Additionally, the data controller shall provide meaningful information about the logic involved in the algorithmic decision. Third, this paper critically analyses the first known case of a court using the access right under the freedom of information regime to grant an injunction to release the source code of the computer program that implements an algorithm. Only an integrated approach -- which takes into account intellectual property, data protection, and freedom of information -- may provide the citizen affected by an algorithmic decision with an effective remedy, as required by the Charter of Fundamental Rights of the EU and the European Convention on Human Rights.", issn = "2190-3387", url = "http://nbn-resolving.de/urn:nbn:de:0009-29-46778" }
RIS
TY - JOUR AU - Noto La Diega, Guido PY - 2018 DA - 2018// TI - Against the Dehumanisation of Decision-Making – Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information JO - JIPITEC SP - 3 EP - 34 VL - 9 IS - 1 KW - Algorithmic decision-making KW - Data Protection Act 2018 KW - GDPR KW - algorithmic accountability KW - algorithmic bias KW - algorithmic governance KW - algorithmic transparency KW - freedom of information request KW - patent infringement defences KW - right not to be subject to an algorithmic decision KW - software copyright exceptions AB - This work presents ten arguments against algorithmic decision-making. These revolve around the concepts of ubiquitous discretionary interpretation, holistic intuition, algorithmic bias, the three black boxes, psychology of conformity, power of sanctions, civilising force of hypocrisy, pluralism, empathy, and technocracy. Nowadays algorithms can decide if one can get a loan, is allowed to cross a border, or must go to prison. Artificial intelligence techniques (natural language processing and machine learning in the first place) enable private and public decision-makers to analyse big data in order to build profiles, which are used to make decisions in an automated way. The lack of transparency of the algorithmic decision-making process does not stem merely from the characteristics of the relevant techniques used, which can make it impossible to access the rationale of the decision. It depends also on the abuse of and overlap between intellectual property rights (the “legal black box”). In the US, nearly half a million patented inventions concern algorithms; more than 67% of the algorithm-related patents were issued over the last ten years and the trend is increasing. To counter the increased monopolisation of algorithms by means of intellectual property rights (with trade secrets leading the way), this paper presents three legal routes that enable citizens to ‘open’ the algorithms. First, copyright and patent exceptions, as well as trade secrets, are discussed. Second, the EU General Data Protection Regulation is critically assessed. In principle, data controllers are not allowed to use algorithms to take decisions that have legal effects on the data subject’s life or similarly significantly affect them. However, when they are allowed to do so, the data subject still has the right to obtain human intervention, to express their point of view, as well as to contest the decision. Additionally, the data controller shall provide meaningful information about the logic involved in the algorithmic decision. Third, this paper critically analyses the first known case of a court using the access right under the freedom of information regime to grant an injunction to release the source code of the computer program that implements an algorithm. Only an integrated approach – which takes into account intellectual property, data protection, and freedom of information – may provide the citizen affected by an algorithmic decision with an effective remedy, as required by the Charter of Fundamental Rights of the EU and the European Convention on Human Rights. SN - 2190-3387 UR - http://nbn-resolving.de/urn:nbn:de:0009-29-46778 ID - notoladiega2018 ER -
Wordbib
<?xml version="1.0" encoding="UTF-8"?> <b:Sources SelectedStyle="" xmlns:b="http://schemas.openxmlformats.org/officeDocument/2006/bibliography" xmlns="http://schemas.openxmlformats.org/officeDocument/2006/bibliography" > <b:Source> <b:Tag>notoladiega2018</b:Tag> <b:SourceType>ArticleInAPeriodical</b:SourceType> <b:Year>2018</b:Year> <b:PeriodicalTitle>JIPITEC</b:PeriodicalTitle> <b:Volume>9</b:Volume> <b:Issue>1</b:Issue> <b:Url>http://nbn-resolving.de/urn:nbn:de:0009-29-46778</b:Url> <b:Pages>3-34</b:Pages> <b:Author> <b:Author><b:NameList> <b:Person><b:Last>Noto La Diega</b:Last><b:First>Guido</b:First></b:Person> </b:NameList></b:Author> </b:Author> <b:Title>Against the Dehumanisation of Decision-Making – Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information</b:Title> <b:Comments>This work presents ten arguments against algorithmic decision-making. These revolve around the concepts of ubiquitous discretionary interpretation, holistic intuition, algorithmic bias, the three black boxes, psychology of conformity, power of sanctions, civilising force of hypocrisy, pluralism, empathy, and technocracy. Nowadays algorithms can decide if one can get a loan, is allowed to cross a border, or must go to prison. Artificial intelligence techniques (natural language processing and machine learning in the first place) enable private and public decision-makers to analyse big data in order to build profiles, which are used to make decisions in an automated way. The lack of transparency of the algorithmic decision-making process does not stem merely from the characteristics of the relevant techniques used, which can make it impossible to access the rationale of the decision. It depends also on the abuse of and overlap between intellectual property rights (the “legal black box”). In the US, nearly half a million patented inventions concern algorithms; more than 67% of the algorithm-related patents were issued over the last ten years and the trend is increasing. To counter the increased monopolisation of algorithms by means of intellectual property rights (with trade secrets leading the way), this paper presents three legal routes that enable citizens to ‘open’ the algorithms. First, copyright and patent exceptions, as well as trade secrets, are discussed. Second, the EU General Data Protection Regulation is critically assessed. In principle, data controllers are not allowed to use algorithms to take decisions that have legal effects on the data subject’s life or similarly significantly affect them. However, when they are allowed to do so, the data subject still has the right to obtain human intervention, to express their point of view, as well as to contest the decision. Additionally, the data controller shall provide meaningful information about the logic involved in the algorithmic decision. Third, this paper critically analyses the first known case of a court using the access right under the freedom of information regime to grant an injunction to release the source code of the computer program that implements an algorithm. Only an integrated approach – which takes into account intellectual property, data protection, and freedom of information – may provide the citizen affected by an algorithmic decision with an effective remedy, as required by the Charter of Fundamental Rights of the EU and the European Convention on Human Rights.</b:Comments> </b:Source> </b:Sources>
ISI
PT Journal AU Noto La Diega, G TI Against the Dehumanisation of Decision-Making – Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information SO JIPITEC PY 2018 BP 3 EP 34 VL 9 IS 1 DE Algorithmic decision-making; Data Protection Act 2018; GDPR; algorithmic accountability; algorithmic bias; algorithmic governance; algorithmic transparency; freedom of information request; patent infringement defences; right not to be subject to an algorithmic decision; software copyright exceptions AB This work presents ten arguments against algorithmic decision-making. These revolve around the concepts of ubiquitous discretionary interpretation, holistic intuition, algorithmic bias, the three black boxes, psychology of conformity, power of sanctions, civilising force of hypocrisy, pluralism, empathy, and technocracy. Nowadays algorithms can decide if one can get a loan, is allowed to cross a border, or must go to prison. Artificial intelligence techniques (natural language processing and machine learning in the first place) enable private and public decision-makers to analyse big data in order to build profiles, which are used to make decisions in an automated way. The lack of transparency of the algorithmic decision-making process does not stem merely from the characteristics of the relevant techniques used, which can make it impossible to access the rationale of the decision. It depends also on the abuse of and overlap between intellectual property rights (the “legal black box”). In the US, nearly half a million patented inventions concern algorithms; more than 67% of the algorithm-related patents were issued over the last ten years and the trend is increasing. To counter the increased monopolisation of algorithms by means of intellectual property rights (with trade secrets leading the way), this paper presents three legal routes that enable citizens to ‘open’ the algorithms. First, copyright and patent exceptions, as well as trade secrets, are discussed. Second, the EU General Data Protection Regulation is critically assessed. In principle, data controllers are not allowed to use algorithms to take decisions that have legal effects on the data subject’s life or similarly significantly affect them. However, when they are allowed to do so, the data subject still has the right to obtain human intervention, to express their point of view, as well as to contest the decision. Additionally, the data controller shall provide meaningful information about the logic involved in the algorithmic decision. Third, this paper critically analyses the first known case of a court using the access right under the freedom of information regime to grant an injunction to release the source code of the computer program that implements an algorithm. Only an integrated approach – which takes into account intellectual property, data protection, and freedom of information – may provide the citizen affected by an algorithmic decision with an effective remedy, as required by the Charter of Fundamental Rights of the EU and the European Convention on Human Rights. ER
MODS
<mods> <titleInfo> <title>Against the Dehumanisation of Decision-Making – Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information</title> </titleInfo> <name type="personal"> <namePart type="family">Noto La Diega</namePart> <namePart type="given">Guido</namePart> </name> <abstract>This work presents ten arguments against algorithmic decision-making. These revolve around the concepts of ubiquitous discretionary interpretation, holistic intuition, algorithmic bias, the three black boxes, psychology of conformity, power of sanctions, civilising force of hypocrisy, pluralism, empathy, and technocracy. Nowadays algorithms can decide if one can get a loan, is allowed to cross a border, or must go to prison. Artificial intelligence techniques (natural language processing and machine learning in the first place) enable private and public decision-makers to analyse big data in order to build profiles, which are used to make decisions in an automated way. The lack of transparency of the algorithmic decision-making process does not stem merely from the characteristics of the relevant techniques used, which can make it impossible to access the rationale of the decision. It depends also on the abuse of and overlap between intellectual property rights (the “legal black box”). In the US, nearly half a million patented inventions concern algorithms; more than 67% of the algorithm-related patents were issued over the last ten years and the trend is increasing. To counter the increased monopolisation of algorithms by means of intellectual property rights (with trade secrets leading the way), this paper presents three legal routes that enable citizens to ‘open’ the algorithms. First, copyright and patent exceptions, as well as trade secrets, are discussed. Second, the EU General Data Protection Regulation is critically assessed. In principle, data controllers are not allowed to use algorithms to take decisions that have legal effects on the data subject’s life or similarly significantly affect them. However, when they are allowed to do so, the data subject still has the right to obtain human intervention, to express their point of view, as well as to contest the decision. Additionally, the data controller shall provide meaningful information about the logic involved in the algorithmic decision. Third, this paper critically analyses the first known case of a court using the access right under the freedom of information regime to grant an injunction to release the source code of the computer program that implements an algorithm. Only an integrated approach – which takes into account intellectual property, data protection, and freedom of information – may provide the citizen affected by an algorithmic decision with an effective remedy, as required by the Charter of Fundamental Rights of the EU and the European Convention on Human Rights.</abstract> <subject> <topic>Algorithmic decision-making</topic> <topic>Data Protection Act 2018</topic> <topic>GDPR</topic> <topic>algorithmic accountability</topic> <topic>algorithmic bias</topic> <topic>algorithmic governance</topic> <topic>algorithmic transparency</topic> <topic>freedom of information request</topic> <topic>patent infringement defences</topic> <topic>right not to be subject to an algorithmic decision</topic> <topic>software copyright exceptions</topic> </subject> <classification authority="ddc">340</classification> <relatedItem type="host"> <genre authority="marcgt">periodical</genre> <genre>academic journal</genre> <titleInfo> <title>JIPITEC</title> </titleInfo> <part> <detail type="volume"> <number>9</number> </detail> <detail type="issue"> <number>1</number> </detail> <date>2018</date> <extent unit="page"> <start>3</start> <end>34</end> </extent> </part> </relatedItem> <identifier type="issn">2190-3387</identifier> <identifier type="urn">urn:nbn:de:0009-29-46778</identifier> <identifier type="uri">http://nbn-resolving.de/urn:nbn:de:0009-29-46778</identifier> <identifier type="citekey">notoladiega2018</identifier> </mods>
Full Metadata
Bibliographic Citation | Journal of intellectual property, information technology and electronic commerce law 9 (2018) 1 |
---|---|
Title | Against the Dehumanisation of Decision-Making – Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information (eng) |
Author | Guido Noto La Diega |
Language | eng |
Abstract | This work presents ten arguments against algorithmic decision-making. These revolve around the concepts of ubiquitous discretionary interpretation, holistic intuition, algorithmic bias, the three black boxes, psychology of conformity, power of sanctions, civilising force of hypocrisy, pluralism, empathy, and technocracy. Nowadays algorithms can decide if one can get a loan, is allowed to cross a border, or must go to prison. Artificial intelligence techniques (natural language processing and machine learning in the first place) enable private and public decision-makers to analyse big data in order to build profiles, which are used to make decisions in an automated way. The lack of transparency of the algorithmic decision-making process does not stem merely from the characteristics of the relevant techniques used, which can make it impossible to access the rationale of the decision. It depends also on the abuse of and overlap between intellectual property rights (the “legal black box”). In the US, nearly half a million patented inventions concern algorithms; more than 67% of the algorithm-related patents were issued over the last ten years and the trend is increasing. To counter the increased monopolisation of algorithms by means of intellectual property rights (with trade secrets leading the way), this paper presents three legal routes that enable citizens to ‘open’ the algorithms. First, copyright and patent exceptions, as well as trade secrets, are discussed. Second, the EU General Data Protection Regulation is critically assessed. In principle, data controllers are not allowed to use algorithms to take decisions that have legal effects on the data subject’s life or similarly significantly affect them. However, when they are allowed to do so, the data subject still has the right to obtain human intervention, to express their point of view, as well as to contest the decision. Additionally, the data controller shall provide meaningful information about the logic involved in the algorithmic decision. Third, this paper critically analyses the first known case of a court using the access right under the freedom of information regime to grant an injunction to release the source code of the computer program that implements an algorithm. Only an integrated approach – which takes into account intellectual property, data protection, and freedom of information – may provide the citizen affected by an algorithmic decision with an effective remedy, as required by the Charter of Fundamental Rights of the EU and the European Convention on Human Rights. |
Subject | Algorithmic decision-making, Data Protection Act 2018, GDPR, algorithmic accountability, algorithmic bias, algorithmic governance, algorithmic transparency, freedom of information request, patent infringement defences, right not to be subject to an algorithmic decision, software copyright exceptions |
DDC | 340 |
Rights | DPPL |
URN | urn:nbn:de:0009-29-46778 |