Transcript
This transcript was autogenerated.
Hi everyone, welcome to this session dedicated to ethics for quantum computing. We will go through the different aspects of ethics in the technological environment.
Quantum information technologies covering
quantum computing, quantum communication, and quantum sensing
are among the most significant technologies to emerge in
recent decades, offering the promise of
paratic mishifting computational capacity with significant
ethical consequences. On a technical level,
the unique features of quantum information processing
have consequences for the imposition of fairness and ethical
constraints on computation. It may seem too early to worry about the ethical implications of quantum computing, given the lack of clarity as to when it will see widespread use. However, now is a perfect time. Consider how quickly machine learning was embedded into business processes before most understood how damaging it could be to an organization, its customers and its reputation. Without a top-down mandate for the ethical development of quantum computing, technologists might meet business objectives while their creations lead to unintended ethical consequences.
We can situate quantum ethics at the cross-disciplinary intersection of quantum information science, technology ethics and moral philosophy to assess the impact of this emerging technology. The four areas that we are going to explore at a high level are safety, security, privacy, and fairness.
The development of public key cryptography was revolutionary, enabling new ways of computing
securely. However, public key
algorithms are vulnerable to
quantum attacks because they derive their strength
from the difficulty of solving the discrete logarithm problem
or factoring large integers.
As discovered by mathematician Peter Shor,
these types of problems can be solved very quickly using
a sufficiently strong quantum computer.
So in the case of asymmetric or public key
cryptography, we need new math that will
stand up to quantum attacks because today's public
key algorithms will be completely broken.
Grover's algorithm, from the scientist Lov Grover, is a quantum search algorithm. Under Grover's algorithm, some symmetric algorithms are weakened and some are broken.
Key size and message digest size are important considerations that factor into whether an algorithm is quantum safe or not. For example, use of the Advanced Encryption Standard (AES) with 256-bit keys is considered quantum safe, but Triple DES can be broken no matter the key size. That applies to the symmetric algorithms usually utilized by the banking industry, while the asymmetric public and private key algorithms used, for instance, by connected cars for vehicle-to-everything communications will be easily broken.
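The halving effect Grover's algorithm has on symmetric key strength can be sketched in a few lines; the helper names and the 128-bit threshold are illustrative conventions, not part of any standard:

```python
# Illustrative sketch: Grover's algorithm gives a quadratic speedup on
# brute-force key search, so a k-bit symmetric key offers roughly k/2
# bits of security against a quantum attacker.

def quantum_security_bits(classical_key_bits: int) -> int:
    """Effective security of a symmetric key under Grover's algorithm."""
    return classical_key_bits // 2

def is_quantum_safe(classical_key_bits: int, threshold: int = 128) -> bool:
    """A common rule of thumb: keep at least 128 bits of post-Grover security."""
    return quantum_security_bits(classical_key_bits) >= threshold

for name, bits in [("AES-128", 128), ("AES-256", 256), ("3DES (112-bit)", 112)]:
    print(name, quantum_security_bits(bits), is_quantum_safe(bits))
```

This is why AES-256 (128 bits of post-Grover security) stays on the safe side of the rule of thumb while shorter keys do not.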
It must first be noted that the strength of public key cryptosystems against classical computing attacks has traditionally been estimated through a so-called bits-of-security level. Such a level is defined as the effort required by a classical computer to perform a brute force attack. For instance, an asymmetric cryptosystem has 1024-bit security when the effort required to attack it with a classical computer is similar to the one needed to carry out a brute force attack on a 1024-bit cryptographic key. As a reference, the table here in the chart indicates the security level of some of the most popular symmetric and asymmetric cryptosystems.
The cost of breaking current 80-bit security cryptosystems with classical computers is estimated to be between tens of thousands and hundreds of millions of dollars. In the case of 112-bit cryptosystems, they are expected to remain secure against classical computing attacks for the next 30 to 40 years. However, researchers have determined that 160-bit elliptic curves can be broken with a 1000-qubit quantum computer, while 1024-bit RSA will need roughly 2000 qubits. Such a threat affects not only cryptosystems that rely on integer factorization or elliptic curves, but also others based on problems like the discrete logarithm problem, which can be solved quickly through Shor's algorithm.
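The qubit estimates just quoted can be captured in a small lookup; the figures are the rough ones from the talk, not authoritative resource estimates, and the function name is my own:

```python
# Rough qubit estimates mentioned in the talk (illustrative only; real
# resource estimates vary widely with error-correction assumptions).
QUBITS_TO_BREAK = {
    "ECC-160": 1000,   # 160-bit elliptic curve
    "RSA-1024": 2000,  # 1024-bit RSA modulus
}

def vulnerable_schemes(available_qubits: int) -> list[str]:
    """Schemes that the cited estimates say fall to a machine of this size."""
    return sorted(k for k, q in QUABITS.items() if q <= available_qubits) if False else \
           sorted(k for k, q in QUBITS_TO_BREAK.items() if q <= available_qubits)

print(vulnerable_schemes(1500))  # only the elliptic-curve example
```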
An example is blockchain. Blockchains and other distributed ledger technologies have evolved significantly in recent years, and their use has been suggested for numerous applications due to their ability to provide transparency, redundancy, accountability and so on. In the case of blockchain, such characteristics are provided through public key cryptography and hash functions. However, the fast progress of quantum computing is opening the possibility of performing attacks based on Grover's and Shor's algorithms in the near future. Such algorithms threaten both public key cryptography and hash functions, forcing a redesign of blockchains to make use of cryptosystems that withstand quantum attacks, thus creating what are known as post-quantum, post-quantum-proof, quantum-safe or quantum-resistant cryptosystems for such a purpose. There are several studies going on to set the state of the art on post-quantum cryptosystems and how they can apply to blockchains and DLT.
A definition of quantum-safe cryptography is important. I take as an example the one from the European Telecommunications Standards Institute; it is not the only one. Cryptography helps to provide security for many everyday tasks. When you send an email, make an online purchase or make a withdrawal from an ATM, cryptography helps keep your data private and authenticate your identity.
Today's modern cryptography algorithms derive their strength from the difficulty of solving certain math problems using classical computers, or the difficulty of searching for the right secret key or message. Quantum computers, however, work in a fundamentally different way. Solving a problem that might take millions of years on a classical computer could take hours or minutes on a sufficiently large quantum computer, which will have a significant impact on the encryption, hashing and public key algorithms we are using today. This is where quantum-safe cryptography comes in. Quantum-safe cryptography refers to efforts to identify algorithms that are resistant to attacks by both classical and quantum computers, to keep information assets secure even after a large-scale quantum computer has been built.
The picture over here, out of white paper number eight of the European Telecommunications Standards Institute, shows that without specific action, by 2025 quantum computing techniques will be in a position to expose current cryptography solutions to exponentially growing risk.
Just a quick note on a study running at this moment at the Cambridge Quantum Institute that is delivering quantum cryptography based on quantum physics, moving from math into physics: using photons to transmit the binary keys over fiber optic wires through specific polarizations. The relevant properties of quantum mechanics: particles can exist in more than one place or state at the same time; a quantum property cannot be observed without changing or disturbing it; and whole particles cannot be copied. This allows the creation of a kind of cryptography in which, if a third party intervenes between two communicating parties and changes or disturbs the communication, that is immediately evident, so it is practically impossible for a third party to intervene and modify the communication.
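A minimal, noiseless sketch of this polarization-based key exchange idea (as in BB84: keep only the bits where sender and receiver happened to choose the same measurement basis); this toy simulation ignores noise and eavesdroppers entirely:

```python
# Toy BB84-style sifting (illustrative, noiseless, no eavesdropper):
# Alice encodes random bits in random polarization bases; Bob measures
# in random bases; they keep only positions where the bases matched.
import random

def bb84_sift(n_photons: int, seed: int = 0):
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [rng.choice("+x") for _ in range(n_photons)]  # rectilinear / diagonal
    bob_bases   = [rng.choice("+x") for _ in range(n_photons)]
    # When Bob's basis matches Alice's, he reads her bit exactly;
    # mismatched positions are discarded during public basis comparison.
    key_a, key_b = [], []
    for bit, a, b in zip(alice_bits, alice_bases, bob_bases):
        if a == b:
            key_a.append(bit)
            key_b.append(bit)
    return key_a, key_b

ka, kb = bb84_sift(64)
assert ka == kb  # matching bases give identical sifted keys
print(len(ka), "sifted key bits from 64 photons")
```

In the real protocol, comparing a sample of the sifted key reveals an eavesdropper, because measurement disturbs the photons; that step is omitted here.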
It's just a flash, just to show how things are evolving rapidly, and along a different path, also for cryptography. Let's take a look now, for instance, at what NIST, the US National Institute of Standards and Technology, is working on at this moment. They launched a challenge to provide solutions, and they have selected four algorithms for general encryption and for digital signatures. The selected algorithms are designed for the two main tasks for which encryption is typically used: general encryption is used to protect information exchanged across a public network, while digital signatures are used for identity authentication. To summarize, all four of the algorithms were created by experts collaborating across multiple countries and institutions.
For general encryption, used when we access secure websites, NIST has selected CRYSTALS-Kyber. Among its advantages are comparatively small encryption keys that two parties can exchange easily, as well as its speed of operation. For digital signatures, often used when we need to verify identities during a digital transaction or to sign a document remotely, NIST has selected three algorithms: CRYSTALS-Dilithium, Falcon and SPHINCS+. Reviewers noted the high efficiency of the first two, and NIST recommends CRYSTALS-Dilithium as the primary algorithm, with Falcon for applications that need smaller signatures than Dilithium can provide. The third, SPHINCS+, is somewhat larger and slower than the other two, but is valuable as a backup for one main reason: it is based on different math than all three of NIST's other selections. Three of the selected algorithms are based on a family of math problems called structured lattices, while SPHINCS+ uses hash functions.
There are other algorithms still under consideration, designed for general encryption, that do not use structured lattices or hash functions in their approaches. While the standard is in development, NIST encourages security experts to explore the new algorithms and consider how their applications will use them, but not to bake them into their systems yet, as the algorithms could change slightly before the standard is finalized.
Every organization, though, should prepare for this evolution. This chart shows a suggestion out of the IBM introduction to quantum safe, for instance. So not only public or government organizations are working on it, suggesting that customers and users prepare themselves for the arrival of the quantum computer. For instance, when meeting with clients getting started on their journey to quantum safety, IBM shares a few key milestones to help them get ready to adopt the new quantum-safe standards.
First, discover and classify data. The first step involves classifying the value of data and understanding compliance requirements. This helps to create a data inventory. Then comes the creation of a crypto inventory: once you have classified your data, you will need to identify how your data is encrypted, as well as other uses of cryptography, to create a crypto inventory that will help you during your migration planning. Your crypto inventory will include information like encryption protocols, symmetric and asymmetric algorithms, key lengths, crypto providers and so on.
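The crypto inventory step can be sketched as a simple data structure with a hypothetical vulnerability rule; the field names and the flagging heuristic below are illustrative assumptions, not a prescribed schema:

```python
# Minimal sketch of a crypto inventory entry, with an assumed rule flagging
# quantum-vulnerable items (Shor breaks RSA/ECC-style public key schemes;
# Grover weakens short symmetric keys).
from dataclasses import dataclass

QUANTUM_BROKEN = {"RSA", "ECDSA", "ECDH", "DSA", "DH"}

@dataclass
class CryptoAsset:
    system: str        # e.g. "customer-portal TLS"
    algorithm: str     # e.g. "RSA"
    key_bits: int
    provider: str      # crypto library or HSM in use

    def quantum_vulnerable(self) -> bool:
        if self.algorithm in QUANTUM_BROKEN:
            return True                 # Shor-broken public key schemes
        return self.key_bits < 256      # symmetric keys weakened by Grover

inventory = [
    CryptoAsset("web TLS", "RSA", 2048, "openssl"),
    CryptoAsset("db at rest", "AES", 256, "hsm"),
]
to_migrate = [a.system for a in inventory if a.quantum_vulnerable()]
print(to_migrate)  # ['web TLS']
```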
The third step is to embrace so-called crypto agility. The transition to quantum-safe standards will be a multiyear journey. As standards evolve and vendors move to adopt quantum-safe technology, using a flexible approach and being prepared to make replacements is key. So it's necessary to implement a hybrid approach, as recommended by several industry experts, using both classical and quantum-safe cryptographic algorithms. This maintains compliance with current standards while adding quantum-safe protection.
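A minimal sketch of the hybrid idea, assuming two already-established shared secrets (one classical, one post-quantum; both are placeholders here) folded into a single session key, so the result stays safe as long as either scheme holds:

```python
# Hedged sketch of hybrid key derivation: combine a classical shared secret
# (e.g. from ECDH) with a post-quantum one (e.g. from a Kyber-style KEM).
# Breaking the derived key requires breaking BOTH input secrets.
import hashlib

def hybrid_session_key(classical_secret: bytes, pq_secret: bytes,
                       context: bytes = b"hybrid-kex-v1") -> bytes:
    # Simple concatenation KDF over a context label and both secrets.
    return hashlib.sha256(context + classical_secret + pq_secret).digest()

# The two "secrets" below are demo placeholders, not real exchange outputs.
key = hybrid_session_key(b"ecdh-demo-secret", b"kem-demo-secret")
print(key.hex())
```

Real deployments would use a standardized KDF and authenticated key exchanges; this only shows why a hybrid construction preserves compliance while adding quantum-safe protection.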
We have time to implement quantum-safe solutions before the advent of large-scale quantum computing, but not much time. Moving to new cryptography is complex and will require significant time and investment. We don't know when a large-scale quantum computer capable of breaking public key cryptography algorithms will be available; experts predict it could be possible by the end of this decade. Honestly, IBM has just released a pay-as-you-go, cloud-based quantum capability at 125 qubits, which is a significant capability, so we can expect this to rapidly evolve into much larger quantum computing available to everybody. Let's move now into a different type of ethical concern: the distribution of computing power. The benefits that could come with the power of quantum computing are frequently discussed, but that power can be leveraged for bad purposes as well as good, and even when organisations have the best intentions, there are potential downsides that must be considered. For instance, access. It is unlikely a typical person or a smaller company will ever own a quantum computer, due to their physical and technical complexity, but that doesn't mean they can't benefit. Governments and organizations that want to move everyone along the technology adoption curve in an equitable way, as we have seen with IBM, should think about how to share knowledge of quantum computing.
Next, bias and fairness. Due to the beyond-classical capability of quantum computing, quantum machine learning is applied independently, or embedded in classical models, for decision making, especially in the field of finance. Fairness and other ethical issues are often among the main concerns in decision making. We need to define a formal framework for the fairness verification and analysis of quantum machine learning decision models, where we adopt one of the most popular notions of fairness in the literature, based on the intuition that any two similar individuals must be treated similarly; systems that do so are unbiased.
Quantum noise can improve fairness, and it is possible to develop an algorithm to check whether a quantum machine learning model is fair. In particular, there are algorithms that can find biased kernels of quantum data during the checking. These biased kernels generate infinitely many biased pairs for investigating the unfairness of the model. For example, Google has algorithms designed around a highly efficient data structure, tensor networks, implemented on Google's TensorFlow Quantum. The utility and effectiveness of those algorithms are confirmed by experimental results, including income prediction and credit scoring on real-world data, for a class of random quantum decision models with a 27-qubit dimensional state, tripling that of the state-of-the-art algorithms for verifying quantum machine learning models.
As discussed, an important issue in classical machine learning is how fair the decisions made by machines are. The same issue exists for quantum machine learning. You may say that the fairness of a quantum decision model means treating all input states equally: for example, there is no pair of two close input states with a large difference between their corresponding outcomes.
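That notion of fairness, close inputs must give close outputs, can be checked empirically; the models and the distance metric below are toy stand-ins, not the quantum verification algorithms discussed here:

```python
# Sketch of the "similar individuals, similar treatment" fairness notion:
# a decision model f is fair (for tolerances epsilon, delta) if inputs
# within distance epsilon never differ in outcome by more than delta.
def is_pairwise_fair(model, inputs, epsilon=0.1, delta=0.05):
    """Empirically check all pairs of nearby inputs for large outcome gaps."""
    for i, x in enumerate(inputs):
        for y in inputs[i + 1:]:
            if abs(x - y) <= epsilon and abs(model(x) - model(y)) > delta:
                return False  # biased pair found: close inputs, far outputs
    return True

smooth = lambda x: 0.2 * x                   # small slope: close scores for close inputs
jumpy  = lambda x: 1.0 if x > 0.5 else 0.0   # hard threshold creates biased pairs

points = [i / 20 for i in range(21)]
print(is_pairwise_fair(smooth, points))  # True
print(is_pairwise_fair(jumpy, points))   # False
```

The quantum verification algorithms mentioned above do the analogous check over quantum states, where exhaustive pair enumeration is impossible and smarter search for biased kernels is needed.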
Bias can impact performance. Algorithms are increasingly engaged in economically important decisions. They are used to make decisions regarding sentences in criminal courts, resume screening, pricing, advertising placement, lending decisions, and the news in the media that citizens consume. This development has generated a public debate about bias and unfairness in machine-guided decisions, including several high-profile allegations in finance, criminal sentencing, hiring, advertising targeting and so on.
Fairness concerns have resonated with policymakers in multiple countries, who have adopted or are considering fairness-related regulations for algorithms. We will see later the European approach to that. But the biases are also in the data. When training a quantum system, there can be potential biases in the data used, which can impact the performance and fairness of the system.
For instance, sample bias: if the training data used for quantum systems is not representative of the diverse range of inputs or scenarios that the system may encounter in the real world, it can result in biased outcomes. For example, if the training data predominantly represents a particular demographic or specific experimental conditions, the system may not generalize well to other groups or situations. Then labeling bias, typical of machine learning: the process of labeling data for training quantum systems can introduce biases. Human annotators may unintentionally label data based on their own perspectives or preconceptions, leading to biased training sets. This can result in unfair or discriminatory outcomes when the system is applied to real-world situations. And we should also consider historical bias: if the training data reflects historical biases and inequities, the quantum system may inadvertently, without us knowing, learn and perpetuate those biases. For instance, if historical data exhibits disparities in representation or opportunities for certain groups, the trained system may reproduce or amplify such biases, leading to discriminatory outcomes.
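A first, very simple check against the sample and historical biases just described is to compare group shares in the training data against a reference population; the threshold, group names and data below are invented for illustration:

```python
# Flag groups whose share in the training data deviates from a reference
# population by more than a tolerance (a basic sample-bias check).
from collections import Counter

def representation_gaps(train_groups, reference_shares, tolerance=0.05):
    """Return {group: (actual_share, expected_share)} for flagged groups."""
    counts = Counter(train_groups)
    total = len(train_groups)
    gaps = {}
    for group, expected in reference_shares.items():
        actual = counts.get(group, 0) / total
        if abs(actual - expected) > tolerance:
            gaps[group] = (actual, expected)
    return gaps

# Toy data: group B is underrepresented relative to the population.
train = ["A"] * 80 + ["B"] * 20
population = {"A": 0.6, "B": 0.4}
print(representation_gaps(train, population))
```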
To prevent perpetuating biases and inequities, it is crucial to promote diverse and inclusive development processes. Therefore, the suggestion is, through so-called representation, to include diverse practices and voices in the development of quantum systems, which helps in capturing a broader range of experiences and in reducing, though not eliminating, bias. Diverse teams can identify and rectify biases in the training data, improving fairness and equity in system performance. Ethical guidelines: establishing ethical guidelines and principles during the development process ensures that biases and inequities are actively addressed. These guidelines should promote fairness, transparency and accountability, and should be followed throughout the system design, training and deployment stages. They can, or should, be defined internally by any organization for quantum activities as well as artificial intelligence activities, but here we stay with quantum. Data collection: collecting a diverse and inclusive data set that adequately represents the target population is vital. It is essential to consider factors like demographic diversity, socioeconomic background, and regional variation to ensure the system is trained on a comprehensive and unbiased data set.
Bias detection and mitigation: regularly assessing the quantum system for biases and developing techniques to mitigate them is crucial. This can involve techniques such as fairness-aware learning, debiasing algorithms, or including fairness metrics during the evaluation process.
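One such fairness metric, demographic parity (the gap in positive-outcome rates between groups), can be computed in a few lines; the toy decisions below are invented for illustration:

```python
# Demographic parity gap: difference between the highest and lowest
# positive-outcome rates across groups (0 means parity).
def demographic_parity_gap(outcomes, groups):
    """`outcomes` are 0/1 decisions aligned element-wise with `groups`."""
    rates = {}
    for g in set(groups):
        picked = [o for o, gg in zip(outcomes, groups) if gg == g]
        rates[g] = sum(picked) / len(picked)
    vals = sorted(rates.values())
    return vals[-1] - vals[0]

# Toy evaluation: group A approved 3/4, group B approved 1/4.
outcomes = [1, 1, 1, 0, 1, 0, 0, 0]
groups   = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(demographic_parity_gap(outcomes, groups))  # 0.5
```

A metric like this, tracked at every evaluation, is what "including fairness metrics during the evaluation process" means in practice.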
We shouldn't forget external review and auditing. Encouraging external reviews and audits of quantum systems helps in identifying and rectifying biases. Independent scrutiny ensures that the system is evaluated from multiple perspectives and can help mitigate any biases or unfairness that may have been missed during the development process. In summary, diverse and inclusive development is necessary to prevent the perpetuation of biases and inequities in quantum systems, risks that are exacerbated by the utilization of quantum. Next, data harvesting and privacy. In the past several years, there have been major pushes to protect data privacy and ensure artificial intelligence technologies are being used fairly and in ways that benefit the public.
Despite these efforts, rampant data collection still takes place. Since future quantum computers will be able to process large volumes of data more rapidly than today's most sophisticated servers, the availability of quantum computing could further incentivize organisations to collect even more consumer data, thereby supercharging the data harvesting that already takes place. But I am keen on explainability. Quantum computers, and especially quantum machine learning, present the ultimate black box problem. Machine learning developers are familiar with this issue, and deep learning neural networks are notoriously opaque. With quantum computers, explainability is more of a physics problem than a programming problem. It will be difficult to evaluate and judge the decision-making process of quantum algorithms, because they will recognize even more complex patterns across even more data points than today's machine learning models.
Environmentally, we also have to consider that all the things we are speaking about have effects. In particular, quantum computers have the potential to revolutionize computing capabilities, but they also come with significant energy requirements and potential environmental impacts. This chart gives an overview of the energy requirements. Quantum computers operate at extremely low temperatures to maintain the delicate quantum states of their qubits. Cooling systems, such as cryogenic refrigerators, are necessary to achieve and maintain these low temperatures, and the cooling process itself consumes a significant amount of energy. Additionally, the computational operations performed by quantum computers can be energy intensive, depending on the hardware architecture and on the complexity of the algorithms being executed. There is an environmental impact: the energy consumption of quantum computers can have environmental implications, primarily through greenhouse gas emissions and contributions to climate change. The energy generation required to power quantum computing facilities can rely on nonrenewable sources like fossil fuels, which release carbon dioxide and other greenhouse gases when burned. Moreover, the manufacturing processes for quantum computing may also have environmental impacts, due to the extraction and processing of raw materials as well as waste generation.
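As a back-of-the-envelope illustration of these energy requirements, operational emissions are just average power times hours times grid carbon intensity; the 25 kW draw and 0.4 kg CO2/kWh figures below are assumptions made for the sketch, not measured values:

```python
# Back-of-the-envelope operational emissions estimate; all numbers are
# purely illustrative (real cryostat power and grid intensity vary a lot).
def annual_co2_kg(avg_power_kw: float, grid_kg_co2_per_kwh: float,
                  hours_per_year: float = 8760.0) -> float:
    """Energy drawn over a year multiplied by the grid's carbon intensity."""
    return avg_power_kw * hours_per_year * grid_kg_co2_per_kwh

# Assumed 25 kW continuous draw (cooling plus control electronics) on a
# grid emitting 0.4 kg CO2 per kWh (both assumptions, not measured figures).
print(round(annual_co2_kg(25.0, 0.4)), "kg of CO2 per year")
```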
It is important to note that quantum computers are still in the early stages of development, and their energy efficiency is not yet on par with classical computers. Classical computers, which power most of our current computational infrastructure, have benefited from decades of optimization and advancements in energy efficiency. Quantum computers have a long way to go in terms of reducing their energy requirements and improving their energy efficiency to become more sustainable. To mitigate that, we can pursue several strategies.
Energy efficiency research: continued research and development efforts should focus on improving the energy efficiency of quantum computing. This includes optimizing the design and architecture of hardware components, reducing the energy required for cooling, and developing more efficient algorithms to minimize computational operations. Renewable energy integration: promoting the use of renewable energy sources, such as solar, wind, and hydroelectric power, to meet the energy demands of quantum computing can significantly reduce the environmental impact. Investing in renewable energy infrastructure ensures that the energy consumed by quantum computers comes from sustainable sources. Then lifecycle assessment: conducting comprehensive lifecycle assessments of quantum computing can help identify and mitigate environmental impacts. These assessments should consider not only the energy consumption during operation, but also the energy and resources required during manufacturing, transportation, disposal of the hardware, and so on. That brings us to the circular economy: implementing recycling and waste management for quantum computing hardware can minimize the environmental impact associated with its production and disposal. Emphasizing a circular economy approach can promote the reuse and recycling of materials, reducing the need for raw material extraction and minimizing waste generation. Of course, policies and standards are relevant, because governments and regulatory bodies can play a crucial role in promoting sustainable practices in the development and operation of quantum computing. Implementing energy efficiency standards, incentivizing the use of renewable energy, and supporting research and development for sustainable quantum technologies can have a significant positive impact.
Addressing the energy requirements and environmental impact of quantum computing is essential for its long-term viability and adoption. By focusing on energy efficiency, renewable energy integration, lifecycle assessment, recycling, and policy initiatives, we can work toward ensuring that quantum computing technology aligns with global sustainability goals.
Let's speak briefly about the ethical framework out of the European Artificial Intelligence Act, which was approved on 15 June by the European Community and is now in the process of being adopted by the different countries in Europe, probably before the end of the year. It focuses on artificial intelligence, but will also have effects on quantum computing used for specific applications. The goal is to enable Europe to lead a correct approach to any kind of artificial intelligence and identify prohibited and high-risk areas that need to be monitored and should be regulated to avoid conflicts and legal problems. In addition, they want to protect the rights of European citizens, where laws are generally stricter and more restrictive than, for instance, in the United States or elsewhere. There is a clear difficulty in regulating technological change: the risk of obsolescence, given the speed of technological innovation, is high.
A very well prepared flowchart that describes all the effects of this act, including a portion that can be applied to quantum computing, was published by Vargas and Salman. I put the reference here and would suggest it for people interested in going through it. For us, we stay in this session just on the non-admitted activities, which protect some of the things we mentioned at the beginning, mainly the privacy and the fairness of the application of technology. Therefore, real-time remote biometric identification systems in publicly accessible spaces are not admitted, and post remote biometric identification systems as well. Categorization based on biometrics using sensitive characteristics (gender, race, ethnicity and so on) is forbidden, as are predictive policing systems based on profiling, location or past criminal behavior, and emotion recognition systems in law enforcement, border management, the workplace and educational institutions. And we should not forget the indiscriminate scraping of biometric data from social media. All these things are not admitted. Of course,
the use of quantum computing may reinforce these kinds of activities, so the influence on quantum computing is evident. I will also recall the path of the so-called Rome Call for artificial intelligence ethics, which in reality refers to all of technology. In 2022, several scientists, philosophers and theologians in bioethics progressively worked out three areas (ethics, education and rights) where technology should in some way be monitored, and six principles: transparency, inclusion, accountability, impartiality, reliability, security and privacy. And in November 2022, they all met again in a workshop called Converging on the Person: Emerging Technologies for the Common Good, within the Rome Call.
There is a clear awareness of the critical situation of our relationship as human beings with new technologies, and of the wording that the technological form of human experience is becoming more pervasive every day. In the distinction between natural and artificial, biological and technological, the criteria by which to discern what is human and what is technical become more and more difficult. The question is cultural. It is necessary to reaffirm the importance of the concept of personal conscience as a relational experience, which cannot disregard either corporality or culture; in other words, the network of relationships, both subjective and communal. Technology cannot replace human contact, the virtual cannot replace the real, and neither can social media replace the social sphere: technology should support, but not replace. And we are tempted to make the virtual prevail over the real. This is an ugly temptation.
The Pope also contributed a sentence, saying that it is good that technology continues to overcome merely technical approaches, to contribute to the definition of a new humanism and to encourage mutual listening and mutual understanding between science, technology and society. The lack of constructive dialogue between these realities in fact impoverishes the mutual trust which is the basis of all human coexistence and of all forms of social friendship. So we want to place the human being in the middle, and not just the race for technology. And that is the core of ethics. As we said at the beginning,
what can we do as organizations? It is important that we manage to maintain a common guard across the various elements inside the development of activities, whatever the organization, and we should in some way support the ability to prevent problems. So every organization is supposed to think about all the different aspects once it faces a new development that implies the use of new technologies such as quantum computing and artificial intelligence. That means the introduction of somebody, let's call them a digital ethics officer, who has a large role, because they should cover multiple aspects: have a deep knowledge and understanding of the company's digital processes; clearly identify the exposure and ethical risks of projects under development; and organize and lead the operational governance for the supervision of the human relationship aspects of artificial intelligence projects, including quantum computing utilization. Because in an organization, whatever it is, the human being remains relevant.
They should be the ethical reference for all digital processes, considering the fact that there are so many evolutions in regulations, approaches and sensibilities in society, and strengthen the transversality of the reference and of the people involved in digital projects, not just the developers. There should be a culture of cooperation and collective decision making to obtain co-planning and transparency within the organization. It is a role that should sensitize the organization and advise the different organizational entities of ethical impacts; possibly define an ethical strategy; develop evaluation and control tools inside the framework of internal and external rules; eliminate or minimize the ethical risk of digital projects; integrate and improve the synergy of mechanisms to protect the fundamental rights of consumers and citizens; and organize internal reflection. All of the above requires that the digital ethics officer have great human, scientific and practical qualities, to be able to instill the confidence necessary for this guidance. It's just a suggestion that came out of a workshop I attended last year in Dublin.
And in this last chart you may see the challenges this role is going to face. It can be one person, more than one person, or a team; it can be something embedded in the way of working of the different people, whether or not they are developers of technological projects. But it should be something embedded in the culture of an organization working on these kinds of things.
As a conclusion, let's say that we should start preparing today for tomorrow's quantum ethics. The different stakeholders should start thinking through the potential challenges and understand how quantum computing may create ethical risks in the future. There are existing ethical frameworks for understanding the impact of technology, and many of the key considerations carry over to quantum computing and strategy. Enterprises should convene internal leaders and experts to determine trigger events, such as a new technological advancement or an action by a competitor, that will define the need to act or increase investment. Approaches to ethical risk mitigation should be part of developing a quantum technology strategy. Quantum computing promises to be extremely powerful. Now is organizations' opportunity to potentially avoid the kind of ethical pitfalls that the move-fast-and-break-things era left behind.
We are at the beginning of a journey. It's exciting, but with some risk; that's normal for every trip. So I leave you my references in case you want to contact me to expand further on these topics. I'll share that there are several; I mentioned the nonprofit organizations I'm collaborating with that are sensitive to ethics and working on these kinds of things. In particular, I'm an advisor in a historical cultural association in Italy that promotes and encourages the study of technology and its ethical development and use, specifically for artificial intelligence; active in Italy, the UN and France is Europea, of which I'm a member, promoting the use of good artificial intelligence. I'm also part of a free association of experts that supports individuals in upgrading their skills; I'm in the area of technology, and there is also a branch on crafts and arts to help people learn how the new technologies may make their lives better without risking bad effects. I thank all the people who have been patient enough to listen until now, and I invite you to keep following this evolution, because it is really important for the entire society. Thank you again.