
FinTech Ethics and Risks

Posted on March 6, 2020. Updated March 10, 2020.

This course is from edX. Full details below.

About this course

FinTech has started a global revolution in the financial services industry, and the transformation will only accelerate in the coming years. There are many ways in which FinTech can improve the lives of people around the world; however, those same technologies can also be used to enslave, coerce, track, and control people. Accordingly, it is appropriate and necessary to consider the implications of introducing these technologies, so that they are utilized properly and regulated sufficiently, and so that their adoption does not come at the expense of societal growth.

This 6-week online course covers 6 modules, representing the full spectrum of finance, technology, and the introduction of FinTech solutions globally. We will ask questions that are not often asked or addressed when new technologies are adopted. Why should we adopt FinTech solutions, and what are the best ways to introduce disruptive technologies? How does blockchain technology change the way we provide financial services, and how should blockchain technology be governed? Is FinTech creating risks in cybersecurity, and how can technology help us prevent financial crimes? As Artificial Intelligence (AI) is developed and adopted, will human biases and prejudices be built into such mechanisms? And at a larger scope, should FinTech lead to a decentralized, democratized system of finance, or will existing institutions adopt FinTech strategies to cement their existing hold on the financial markets?

Through discussing and attempting to answer these questions, you will better understand how the introduction of these technologies can benefit or harm society. And by considering the proper application and introduction of such technologies, you will learn to make better decisions, as individuals and organizations, when facing the question: is FinTech our savior or our villain?

What you’ll learn

  • Understand the ethical elements of finance, emerging technologies, and FinTech.
  • Identify trends and opportunities that will shape the future of FinTech.
  • Critically examine the implications of Artificial Intelligence (AI), blockchain, and cryptocurrencies (including ICOs).
  • Understand how Regulatory Technology (RegTech) enhances supervision and reduces compliance-related costs.
  • Understand how payment solutions are evolving and the potential ethical implications.
  • Understand how alternative financing, including crowdfunding and P2P lending, is impacting markets.
  • Analyze positive and negative aspects of the introduction and expansion of FinTech.

Syllabus

Introduction: Ethics of Finance and Emerging Technologies
This module will provide a broad historical perspective on ethical issues relating to finance and the introduction or adoption of emerging technologies.

Blockchain and Its Governance
This module will build on the Introduction to FinTech course (https://www.edx.org/course/introduction-to-fintech) to consider the most relevant and ethical ways such technology should be implemented across a number of different industries and product segments. In particular, data collection, customer privacy, and transactional issues will be covered in this module.

Cybersecurity & Crimes
FinTech can make it easier and cheaper for banks to monitor and control financial transactions, reducing both fraud and bank costs. At the same time, these tools can be used to steal money and corporate secrets, hide illegal activity (including purchases of weapons, drugs, etc.), and finance terrorists and other criminal organizations. Accordingly, this module will consider the implications of these important issues.

AI & FinTech
In this module we will consider the implications of building our own concepts of “human” morality into amoral machines, as well as whether human biases and prejudices can or will be built into such mechanisms, whether purposefully or unintentionally.

Institutionalization vs. Decentralization
One of the key reasons people are calling for FinTech is its decentralized nature, which democratizes finance and allows regular people to participate more fully and affordably in financial transactions through technologies like cryptocurrencies, non-government-issued IDs, and P2P lending. In this module we will address some large questions, considering whether FinTech should lead to a decentralized, democratized system of finance, or whether existing institutions will adopt FinTech strategies to cement their existing hold on the financial markets.

Big Questions Relating to the Introduction of FinTech
In this final module, we will consider some of the many outstanding questions about the purposes of introducing FinTech to the world, exploring the many ways that FinTech can both help and hurt society. We will discuss financial inclusion, sustainable development, and many other positive aspects of FinTech development. Conversely, we will also consider how these same technologies and solutions could be used to inhibit access to financial markets, or worse.

Welcome and Course Administration

Welcome to FinTech Ethics and Risks

Hey, listen. We have a decision to make. Humanity stands on the edge of a massive shift in technology and productivity that is going to fundamentally alter our lives. Blockchain, big data, artificial intelligence: these buzzword technologies are rapidly changing our world, just like the steam engine that started the Industrial Revolution. Over the past century, new technologies have changed how we work and even how we define work. During that time, the average number of hours worked has steadily declined in the developed world, but lifestyles have generally improved. And the technologies on the horizon look to completely alter society as we know it.

So here’s the cool part. If we get the next 10 years right, humankind could be well on its way to reaching the type of utopian existence characterized in many stories about the future. This is especially true in the area of financial technology. And while maybe not as sexy or cool as driverless cars, advancements in FinTech will make it easier to send, receive, and invest money. These interactions are at the core of business and commerce, and FinTech stands to alter them completely.

But before we dive headfirst into this brave new world, it’s critical to ask a few questions. Like, why? Why is blockchain technology necessary? Is faster, cheaper, smarter always better when it comes to data? What unintended consequences will arise from introducing artificial intelligence into everyday life?

You see, unlike the steam engine, the magic of these new technologies is that they can scale faster than ever and quickly engulf the entire world. And while they may have the power to unite and transform, they can also be used to bind and control. Advancements in technology over the next decade will certainly lead to massive job loss, and many fear new forms of slavery, surveillance, and crime. Now is the time for us to consider what we want and what we will allow. We can’t wait until these new technologies are fully developed. Once we push play, we can’t just rewind. If we don’t talk about it now, it will be too late.

This course is a chance for us to consider these questions together. Through it we hope to explore the implications for us individually and collectively. We live in a world where distance is relative and resources are growing scarcer, where local problems now have global implications. Humanity may stand on the edge, but we stand on it together. So join us as we consider these tough questions and help shape our collective future.

Course Outline

FinTech Ethics and Risks is a six-week, six-module course. Each weekly module comprises 5-7 sections, made up of 15-20 learning units. Each learning unit contains a short lecture video, followed by learning activities such as Quick Check questions, Polling, Word Cloud, and Discussions. In addition, a range of additional resources is provided, including research papers, news articles, industry reports, and useful links. There is a Conclusion Quiz at the end of each module.

Discussion is a very important part of your learning experience in this course. The course instructors will post questions and discussion prompts under each topic, and selectively comment on your responses. By the end of the course, active participants will also be invited to be discussion moderators and community TAs for the next course cohort.

Course Outline:

Module 1: The Ethics of Finance

This module will provide a broad historical perspective on ethical issues relating to finance and the introduction or adoption of emerging technologies.

Module 2: Blockchain and Its Governance

This module will build on the Introduction to FinTech course to consider the most relevant and ethical ways such technology should be implemented across a number of different industries and product segments. In particular, data collection, customer privacy, and transactional issues will be covered in this module.

Module 3: Cybersecurity and Crimes 

FinTech can make it easier and cheaper for banks to monitor and control financial transactions, reducing both fraud and bank costs. At the same time, these tools can be used to steal money and corporate secrets, hide illegal activity (including purchases of weapons, drugs, etc.), and finance terrorists and other criminal organizations. Accordingly, this module will consider the implications of these important issues.

Module 4: Artificial Intelligence & FinTech

In this module we will consider the implications of building our own concepts of “human” morality into amoral machines, as well as whether human biases and prejudices can or will be built into such mechanisms, whether purposefully or unintentionally.

Module 5: A Decentralized Future

One of the key reasons people are calling for FinTech is its decentralized nature, which democratizes finance and allows regular people to participate more fully and affordably in financial transactions through technologies like cryptocurrencies, non-government-issued IDs, and P2P lending. In this module we will address some large questions, considering whether FinTech should lead to a decentralized, democratized system of finance, or whether existing institutions will adopt FinTech strategies to cement their existing hold on the financial markets.

Module 6: Positive Impact of FinTech

In this final module, we will consider some of the many outstanding questions about the purposes of introducing FinTech to the world, exploring the many ways that FinTech can both help and hurt society. We will discuss financial inclusion, sustainable development, and many other positive aspects of FinTech development. Conversely, we will also consider how these same technologies and solutions could be used to inhibit access to financial markets, or worse.

Module 1: The Ethics of Finance

1.0 Course Introduction

Welcome to FinTech Ethics and Risks. We are excited to embark on this learning journey with you, and we genuinely believe that the principles we will explore together are at the heart of one of the great debates that humanity will need to address in our lifetimes. Over the last few months, as we have prepared this course, this reality has become even clearer to us. Advances in technology, especially those related to financial technologies, or “FinTech”, are already starting to impact us, and will eventually become so pervasive that they will be a core part of our existence. Because of that, we felt compelled to teach this course, in order to collectively consider, with you, key principles and questions about how we want to manage technological change as it intersects with our lives.

In developing new technologies, the key focus is usually whether something can be developed or created, captured in essence by the question, “Can we do it?” This important question has been an engine driving human progress and technological advancement. There is, however, another equally critical question that is usually not asked: “Should we do it?” This question is incredibly important because it forces us to consider the impact of new technologies at their genesis, and not when it is too late or too difficult to mitigate negative aspects of the technology that were not initially considered. So at its core, this course is about considering the impact of new technologies, especially FinTech, before they are so mature and embedded that they cannot be managed.

To kick off our journey, we will first consider the history of finance and its role in society before moving on to an interesting case study about the financial institution Wells Fargo. Then we will lay out the five key principles that frame the course: trust, proximity, accountability, cultural lag, and privacy. We will return to each of these principles repeatedly through the rest of the modules.

Lastly, while the nature of ethics sometimes requires an exploration of the dark realities of life, please don’t mistake that for a lack of enthusiasm about the future. If thoughtfully managed, we believe FinTech is a key to a utopian future where society is more fair, just, and inclusive. Thank you again for joining this journey. It’s important, because we have a collective choice to make about that future.

1.1.1 What Is Money?

Before we begin exploring FinTech in greater detail, let’s take a few minutes to consider the history of finance and the purpose it serves in society. To do this we will consider three questions: One, what is money? Two, how do we value money? And three, why do we have banks? Answering these questions will help us understand the rise of FinTech and the moral underpinnings that form the foundation of the industry.

And for those of you in the finance space, please bear with us for a moment. This course is being taken by diverse people from all over the world, and it is going to cover some pretty complicated ideas. We need everyone to have a clear foundation in some of the major principles so that we can move into the more advanced concepts. In reality, society is at a stage where we all should take a step back and review the nature of the finance industry. So whether you’re new to finance or a savvy industry veteran, let’s revisit these foundational principles together.

A few weeks ago, while I was walking my 7-year-old daughter Lola to school, she asked me a question that caught me completely off guard. She looked up at me as we were walking hand in hand and said: “Daddy, what is money?” I was confused by the question, and started mumbling something about bartering, and working, and that we use money to represent value. But no matter what I said, she just kept repeating: “That doesn’t make any sense. That doesn’t make any sense. Money is just paper and is not worth anything.” Well, what I failed to explain to her 7-year-old mind is a key lesson of finance upon which much of society is built: that the value of money is a social construct built on trust.

Let’s look at this another way: close your eyes and imagine you were just given a million dollars. And really, close your eyes – just trust me for a minute. Now, picture it. Really try to think about what you could buy for a million dollars. But now wait a minute – I didn’t tell you the currency of the million dollars. Think about how different that consideration would be if the currency was US dollars versus Hong Kong dollars, or some other type of dollar. As you probably know, the value of money fluctuates based on the relative value of the currency. And this calculation also changes depending on the time: a currency may be more or less valuable today than it was yesterday. This has been starkly evident in the massive fluctuations of cryptocurrencies like Bitcoin over the past few years.

Okay, one last consideration: close your eyes again and envision what you can buy with US$1 million. Try to picture it. You could buy a nice home, a fancy sports car, or finance a trip around the world several times over. Now, picture that pile of cash and what it would look like. Maybe even consider throwing it out over a bed and just rolling around in it for a while. Now imagine you are stuck on a deserted island. You are starving, thirsty, maybe scared. You have that same pile of money – but what can it buy you now? Are you going to be able to negotiate with the apes for some of their bananas with that money? In that context, the dollars might be more valuable as kindling to help you start a fire! Or imagine a small boat pulled up to the island with the ability to rescue you, but the price of your rescue was the entire one million dollars. Would you pay it?

Okay, so what’s the point? We share these stories because, before we get too far in this course, we need you to understand a couple of things. The first is that value is subjective. And when we say value, we are referring both to material value – the value that we ascribe to goods and services – and to the value that we place on morality and personal connections. The second is that the very concept of money is largely a social construct, something we have invented as a medium of exchange and to which we have ascribed a specific value. As my daughter Lola noticed at 7 years old, in a vacuum money by itself isn’t really worth anything. And when stuck alone on an island, your banknotes carry little value. So if currency by itself is essentially without value, then why is it so important and coveted so highly? To understand that, we have to go back a few centuries.
1.1.2 How Do We Value Money?

So if currency by itself is essentially without value, then why is it so important and coveted so highly? Let’s do another thought experiment to explore the answer. This time imagine you live in a small European town maybe 250 years ago. You can select your occupation – maybe a blacksmith or cobbler, or, more commonly at the time, a farmer. Back then most economic enterprises were small family businesses, everyone knew each other intimately, and a lot of transactions were based on a barter system. So if you raised chickens, you could trade your eggs for whatever else you needed. For example, if you wanted rice, you would need to find a rice farmer who had spare rice to sell – and who wanted eggs – and then agree on how much rice an egg would get you.

Back then, most transactions were proximate, meaning they were conducted directly between two people meeting in person. In that type of proximate, one-on-one scenario where you both live in the same small town, deceptive sales practices are far less likely because merchants relied on their good name. As you can imagine, it would be pretty awkward if you cheated someone, since you would have to keep running into each other in your small town. As a result, this type of personal connection to the community significantly increased trust within the marketplace.

As you can probably understand, though, this type of proximate, one-on-one barter system is fairly limited. If you were a teacher, for example, providing education, then how many eggs is an hour of education worth? So over time money was introduced as a common medium of exchange, making trade easier and giving rise to service industries and other knowledge-based professions.

But here’s the challenge: let’s say you start to receive currency for your eggs rather than bartering for rice or other goods. How can you ensure the value of that currency? Imagine if for your entire life you’ve been bartering for goods, and now instead someone wants to hand you a piece of paper and say that it is the equivalent value of that particular good. As we discussed earlier, in a vacuum the currency in your wallet has little inherent value. Its value exists simply because we have decided that it does – and society has decided to use it as a medium of exchange. As my daughter Lola said, that really doesn’t make any sense! Well, actually it does, but only when based on a very broad social construct founded on trust. We trust that the money we hold in our hands today will have some meaningful value tomorrow.

Even in today’s much more complicated marketplace, trust remains the basis of our monetary system. Our money today is not backed by a physical commodity like gold. It only has value because governments have declared it to be legal tender – what we often call fiat money – and people believe, or trust, that such status will continue. If you remove trust from the financial system, then the entire thing crumbles, as we have seen happen during financial crises around the world, like the one that is sadly occurring in Venezuela right now.

Additional Readings

  • Silver, L., Johnson, C., & Taylor, K. (2019). Venezuelans Have Little Trust in National Government, Say Economy is in Poor Shape. Pew Research Center. Retrieved from https://www.pewresearch.org/fact-tank/2019/01/25/venezuelans-have-little-trust-in-national-government-say-economy-is-in-poor-shape/

1.1.3 Why Do We Have Banks?

Okay, so we’ve done some interesting thought experiments related to money, how we value money, and particularly how our financial system is based on trust. But then what is the “financial system”? What does that even really mean? Finance and the financial system largely refer to the services related to the management of money. And, as mentioned earlier, this often requires a relationship of deep trust, or what we would today call a “fiduciary relationship.” And if our trust in money is largely a social construct, our trust in the financial industry is largely an economic and legal construct. In other words, we rely on contracts and the law to enforce our rights, rather than on the intimate social relationships that were common in Europe 250 years ago.

Although finance is made up of many types of institutions, banks have always been at the heart of the industry. So let’s take a minute and explore the traditional purpose of banks. Why do we even have them? When I think about a bank, the first thing that comes to mind is a physical location where people go to deposit and withdraw money. Or to put it really simply, it’s the place you stick your money to keep it safe. But this is only part of what banks are for.

During the Industrial Revolution, the traditional feudal system broke down and new industries started popping up all over the world. This led people to move away from farming jobs and into manufacturing and service roles, which for many families meant that they had discretionary income for the first time. As a result, if they didn’t want to hide it under their mattress, they needed a safe place to keep the money. And with the rise of entrepreneurship and companies during that era, many people also sought loans for starting businesses, buying homes, and other consumer necessities. As a result, the financial industry really started to thrive, and banks popped up all over the place. These banks served four primary functions, which really haven’t changed that much even today.

First, banks give people a way to save money safely. This makes sense – I’m sure you’ve seen a TV show or movie that included a bank heist where criminals broke into a vault. The vaults have huge doors, thick walls, complex security systems, and most importantly, lots and lots of cash, gold, and other valuables. And although some of this has changed, especially with much of currency and banking becoming cloud-based and hosted on servers rather than in vaults, security is still the number one reason why so many people use banks to hold their money.

The second traditional function of banks revolves around financing. As economies started changing, people started exploring new uses for credit and capital, such as what many people carry around in their pocket: the credit card. This is a form of financing. The expansion of consumer credit has been a key driving force in enabling many people around the world to move from low income to the middle class, and traditionally banks have been the best source of consumer credit.

The third traditional function of banks is to facilitate investments. Without getting too complicated, let’s use a simple example. Let’s say one day you receive your paycheck, and after paying all your bills you have some money left over. You decide that you want to save that money. But instead of saving it in your bank account, you say: “Hey, I want to buy a mutual fund, or I want to invest in the stock market by purchasing shares in a company that I like.” Banks are at the core of that type of investment activity, and that is an important role they play in society.

The fourth and final traditional function of banks revolves around providing financial advice, often helping companies or individuals make the best use of the money they have at their disposal.

Because of these four functions, banks have been trusted community partners for centuries, and one of the key reasons for the rise of the middle class throughout the world. But are things starting to change?

Additional Readings

  • Scott, B. (2017). Hard Coding Ethics into Fintech. Finance & The Common Good. Retrieved from http://www.ethicsinfinance.org/wp-content/uploads/2018/01/Brett-Scott-Hard-coding-ethics-into-fintech.pdf  

1.1.4 The Loss of Trust in Financial Institutions and the Rise of TechFins

As we mentioned, finance is largely built on trust. And in the past, banks served as the guarantors of trust in the financial world. But trust in financial institutions has diminished pretty significantly in many countries over the past decade. This of course is largely due to the Global Financial Crisis, which affected millions of people around the world.

I remember that time vividly. Back then I was still practicing law full time in Hong Kong, one of the world’s financial centers, so most of my friends, colleagues, and clients were deeply affected by the crisis. Even so, we were only able to watch as the near collapse of the global financial system occurred. David Lee and I watched many of our friends being terminated from their jobs with little warning. We had a daily reminder of how flippantly certain members of the global financial community pursued profits at the expense of their customers, raising concerns that government regulators were not adequately supervising the financial industry.

The crisis and its aftermath highlighted how the financial system and banks failed to perform some of the chief roles they were meant to perform for our society, particularly in managing risk and allocating capital. Millions of people around the world lost their homes, their savings, essentially their futures. In the US alone, it was estimated that American households lost $20 trillion in wealth as a result of the Financial Crisis. It might not surprise you, then, that many people began to distrust the very institutions that were meant to protect and serve them. And the age-old characterization of bankers as greedy, selfish, short-sighted bloodsuckers returned in full force. Let’s be honest: many large financial institutions have not done much since the Financial Crisis to reduce our concerns, with multiple high-profile scandals only helping to hasten the rise of FinTech innovations outside of the traditional financial sector.

Over the past ten years, due in large part to a combination of the Financial Crisis and the advent of the smartphone, a major shift has occurred, characterized by the rise of what we call the TechFins: digital platforms like Facebook, Amazon, Google, and Tencent that provide e-commerce, peer-to-peer lending, and communications, and increasingly serve as the keepers of our digital identity. But after more than a decade of explosive growth, many of the TechFins are themselves embroiled in controversy, once again leaving customers wondering who they can trust. Data privacy breaches and little accountability have caused many people to question their use of these large technology platforms. But the fact remains: people still need financial services. So who will step up as the trusted partners of the future? Of our future? Let’s consider that question as we dive into our first case study.

Additional Readings

  • Buckley, R. (2016). The Changing Nature of Banking and Why It Matters. In R. Buckley, E. Avgouleas, & D. Arner (Eds.), Reconceptualising Global Finance and its Regulation (pp. 9-27). Cambridge: Cambridge University Press. doi:10.1017/CBO9781316181553.002

1.2.1 Case Study – Wells Fargo

Banks have used the decade since the financial crisis to rehabilitate their image, some more successfully than others. But one bank has recently gone above and beyond in reigniting the general public’s disdain towards financial institutions. If banks are built on a foundation of consumer trust, Wells Fargo has systematically dismantled that trust, leading to an uncertain future for the institution.

Wells Fargo has a really interesting history. It was established in 1852 in San Francisco during the gold rush, and as a result has long been an integral part of the American financial landscape. When gold was discovered in California, Wells and Fargo, two entrepreneurs, decided to provide services relating to the transport and safekeeping of gold dust, gold coins, salaries, and other critical resources all across the US Western frontier. You may have seen it before: the stagecoach is the logo, or symbol, of Wells Fargo bank. Before the advent of railroads, stagecoaches were considered the safest and most reliable form of transportation for people and valuables across the dangerous deserts of the Southwest United States. This was the age of the American cowboy, and those stagecoaches were the targets of some of the most notorious bandits of the time. You have probably seen movies with this type of scene, where a stagecoach driver and a guard sit on a seat. They usually carried sawed-off shotguns and revolvers, and often had to fight their way past bandits in the rugged terrain.

This is important because, once again, the crux of the entire business model was based on trust. Trust that the Wells Fargo coach drivers wouldn’t steal the gold dust and bars they were carrying. Trust that the stagecoaches and the roads built would provide reliable transit to ensure payment of railroad employees. Wells Fargo was so trusted by the railroad tycoons that it quickly established the largest fleet of stagecoaches in the world, helping to build one of the oldest and largest banks in the United States, eventually employing more than 200,000 people globally.

In September 2016, news emerged that employees at Wells Fargo, the most valuable bank in the world at the time, had created millions of fake bank and credit accounts that customers had never authorized. Due to a high-pressure sales culture and an incentive-compensation program for employees to create new accounts, Wells Fargo employees had engaged in an array of immoral practices, such as fraudulently opening accounts, issuing ATM cards and assigning PINs, faking signatures, and using false email addresses. Customers were subsequently hit with late fees, overdraft charges, annual fees, and other costs – all of which could affect their credit scores. When customers noticed the charges, employees would apologize and lie, saying there had just been an administrative mistake.

This dishonest program was based on the internal goal of selling at least eight financial products to each customer, or what Wells Fargo called the “Gr-eight initiative.” These products included credit cards, savings accounts, investment accounts, and more. Why eight, you may ask? Because eight rhymed with great! No joke, that’s what they decided: the CEO said “because eight rhymes with great,” and therefore they arbitrarily decided that each customer should have eight accounts with the bank.

Selling different accounts to bank clients is commonly known as cross-selling. Basically, if you go to a bank and open a savings account, they might ask you to open a checking account, or buy an insurance plan. This is called cross-selling, and they wanted the average Wells Fargo customer to have eight such accounts. Why? Well, in part because it allowed the bank to make more money in fees. But to be honest, the fees were minimal, and Wells Fargo didn’t really make much money off of them.

So then why would they do it? Why did the bank put so much pressure on their staff to cross-sell and push eight accounts that managers across the bank started creating fake accounts? The reason is that Wall Street analysts used data like “new accounts opened” as a key metric when evaluating a bank’s share price. That means the more customer accounts Wells Fargo could show, the higher its stock price went, even if Wells Fargo really wasn’t making any additional money. And when analysts saw all the new customer accounts, the share price for Wells Fargo doubled between 2012 and 2015. And who makes money when the share price goes up? Shareholders do, but in particular the executives and directors of the company, who are compensated primarily in stock options. So, in other words, even though Wells Fargo wasn’t making more money, or serving its customers better, the value of the shares doubled, making a lot of money for the bank’s executives – the very people who created this horrible practice in the first place. The high-pressure sales culture created by Wells Fargo bank executives, where you could face getting fired for not hitting the cross-selling goals, created a toxic environment that pushed employees to fear for their jobs and make bad ethical choices, all while management turned a blind eye to the practice.

The program finally became public years after Wells Fargo’s management knew about the problem. When asked why he didn’t notify government officials as soon as he learned about the problem, then-CEO John Stumpf said that the amount of money made by Wells Fargo from the program was immaterial relative to the bank’s size – and thus not important. Of course, this incensed the public and lawmakers alike, and they demanded action.

So what did Wells Fargo do? Well, they didn’t replace any of their senior management. Instead, they terminated nearly 5,300 mid-level employees, stating it was their fault for making up all the fake accounts. Not a single top-level executive was fired at that time. Once again, this did not seem sufficient to the public and US lawmakers. US senators grilled Wells Fargo’s top management, and the media carried story after story detailing the bank’s actions – or perhaps more accurately, inaction. After mounting pressure, then-CEO John Stumpf stepped down, as did Carrie Tolstedt, the head of the community banking division at Wells Fargo. But don’t feel too bad for either of them. When Ms. Tolstedt left Wells Fargo, for example, she received about US$125 million in equity compensation as a retirement package.

All in all, Wells Fargo had engineered what one analyst described as a “virtual fee-generating machine, through which its customers were harmed, its employees were blamed, and Wells Fargo [and its executives] reaped the profits.” In light of the scandal, Wells Fargo and its new CEO, Tim Sloan – the bank’s former COO – emphasized that they would initiate refunds “as part of [their] ongoing efforts to rebuild trust.”

But Wells Fargo’s problems didn’t end there. Its unethical internal culture had permeated several of its businesses, leading to a string of scandals and investigations. For example: in July 2017, Wells Fargo admitted to forcing up to 570,000 borrowers into unneeded auto insurance. Reports also emerged that 110,000 customers had been incorrectly charged “mortgage rate lock extension fees” between September 2013 and February 2017. And last year news also emerged that a computer glitch at Wells Fargo caused hundreds of people to have their homes foreclosed on between 2010 and 2015. As a consequence of these numerous scandals, the Federal Reserve announced on February 2, 2018 that Wells Fargo would not be allowed to grow its assets until it cleaned up its act – an unprecedented punishment.

In May 2018, Wells Fargo launched a marketing campaign to emphasize the company’s commitment to re-establishing trust with its stakeholders. The commercial opens with the Old West origins of the bank, depicting its transition from horse riding to the iconic stagecoach, the steamboat, the train, its branches, its ATMs, and now its mobile systems – portraying its whole technological journey. The video then goes on to reference the scandals, illustrating how it is now a “new day at Wells Fargo.” That new day, and attempt to re-establish trust, may have been in vain. Just a few months later, in August 2018, the US Justice Department announced that Wells Fargo had agreed to pay a $2.1 billion fine for issuing mortgage loans it knew contained incorrect income information. The government said the loans contributed to the 2008 financial crisis that crippled the global economy.

If trust is a key component of the financial system and banks, what does the experience of Wells Fargo tell us about the financial system today? Do you feel like the Wells Fargo example is an outlier, and that most of the financial industry today can be trusted? Or are you skeptical about the ethics of the broader industry as a whole?

Additional Readings

  • Egan, M. (2017). Wells Fargo Dumps Toxic ‘Cross-Selling’ Metric. CNN. Retrieved from https://money.cnn.com/2017/01/13/investing/wells-fargo-cross-selling-fake-accounts/index.html
  • Fortune (2018). Wells Fargo Just Got Hit With Another Penalty for the Financial Crisis. This Time, It’s $2.1 Billion. Retrieved from http://fortune.com/2018/08/01/wells-fargo-financial-crisis-fine-mortgage-backed-security/
  • Cavico, F., & Mujtaba, B. (2017). Wells Fargo’s Fake Accounts Scandal and its Legal and Ethical Implications for Management. S.A.M. Advanced Management Journal, 82(2), 4-19. Retrieved from https://search-proquest-com.eproxy.lib.hku.hk/docview/1926580720?accountid=14548 (paywall)
  • Merle, R. (2017). Wells Fargo’s Scandal Damaged Their Credit Scores. What Does the Bank Owe Them? The Washington Post. Retrieved from https://www.washingtonpost.com/business/economy/in-wake-of-wells-fargo-scandal-whats-to-be-done-about-damaged-credit-scores/2017/08/18/f26d30e6-7c78-11e7-9d08-b79f191668ed_story.html (paywall)

1.2.2 Case Study – Wells Fargo: Breach of Trust

Okay, this is a crazy case that a lot of people in the financial industry were really, really concerned about. So why is this case so important? I mean, there seems to be a lot of financial crime out there, people not doing great things all the time. What made this particularly special?

Yeah, it’s a good question, because again, the actual money that Wells Fargo made from this really wasn’t a lot, so in terms of financial crime it wasn’t that significant. And yet a lot of people were really upset about this. Some financial analysts even said that this was the worst financial crime ever. And I think the main reason is because, you know, for you out there, for me, I choose a bank solely because I need to know that I can trust them, right? And here in this particular instance, they completely betrayed that trust, and seemingly for completely selfish and greedy reasons.

So, when you say selfish and greedy reasons, what do you mean by that?

Well, again, there really was no benefit to the customer here. When you open a bank account and you put some money there, you’re not anticipating that they are going to do all these shady things behind your back: that they are going to open up accounts, or make you get insurance, that you know nothing about. And in this particular instance, I feel like it was just complete dishonesty and betrayal of trust, where there was no benefit to the consumers whatsoever. They didn’t, for example, do any research and say customers are better off if they have eight accounts; they simply said that eight rhymes with great, and so therefore we’re gonna do this.

Okay, so then who did benefit from this kind of activity?

The senior staff, the CEO, various high-level people within the company, specifically those that had stock options, for example. Because, again, it’s very unique, right, because the bank didn’t make very much money off the unethical behavior directly. The reason they made money is because their share price doubled within a short period of time, so they were able to sell off their shares and personally benefit significantly from this, but the bank itself didn’t actually receive a lot of remuneration.

That’s interesting. So you’re saying that, from an economic perspective, the bank did not make any money from this? But somehow these extra accounts they created increased the share price, because Wall Street analysts saw this as some sort of metric that the bank was growing. [Yeah, exactly.] And so, in terms of market value, it seems that it was increasing, but in terms of actual economic value there was no real value added by this behavior.

So, basically the explanation is like this: the bank itself, when it does transactions, makes money off of them, just like you’d make money if you sell hamburgers or whatever. And the bank, from this kind of unethical, even illegal, behavior, only made, they think, between $1.5 million and maybe $2.5 million from these transactions. But here’s the thing: their share price more than doubled, which means that the individuals who owned those shares, including the CEO and various senior officials who were pushing this behavior, made hundreds of millions of dollars collectively, and they walked away with almost all of that. Now, there were some clawbacks, there were some issues where they had to give up some of that money. But again, they walked away, although in disgrace, with hundreds of millions of dollars.

And roughly how much were the fines and penalties that Wells Fargo had to pay because of this kind of behavior?

Yeah, again, this is the terrible thing. If you’re the customer of a bank, and you think that you want the bank to be led by people with integrity because you want to ensure that your investment is safe, here’s the rub: they individually made hundreds of millions of dollars, and then when they left the bank in disgrace, the bank ended up paying hundreds of millions of dollars in various fines and legal fees – more recently, potentially over a billion dollars. And that doesn’t even include the reputational loss: municipal governments and state governments completely removed their business from Wells Fargo, which made it impossible for them to continue growing – or, not impossible, but it’s certainly hurting their bottom line. And it was so bad that the federal government in the US actually stopped their growth, saying: you’ve got to clean this stuff up, because you’re not running this in a reputable way.

So, it seems like there is a tragic irony here: the people who allowed that behavior to occur, or on whose watch it occurred, were able to benefit from it and walk away, and the fines the bank has to pay are really being borne by the current shareholders and the other current stakeholders, such as customers and employees, who have to deal with the fallout of all this.

Yeah, and that includes all of you, by the way. So think about it: if you’re gonna use a bank, if you’re gonna use them for services, how would you feel if you knew that they betrayed your trust in that way? How do you move on from that?

Additional Readings

  • Verschoor, C. C. (2016). Lessons from the Wells Fargo Scandal. Strategic Finance. Retrieved from https://sfmagazine.com/post-entry/november-2016-lessons-from-the-wells-fargo-scandal/ 
  • Volkov, M. (2018). Wells Fargo: Corporate Board Lessons Learned? Ethical Boardroom. Retrieved from https://ethicalboardroom.com/wells-fargo-corporate-board-lessons-learned/ 

1.3.1 Key Ethics Principle – Trust

After learning about the Wells Fargo case, what were some of the underlying thoughts that you had about it? Did the actions of the bank’s leaders surprise you? And would you trust Wells Fargo as your bank after learning what they did? You might be surprised to learn that some financial analysts said this was the worst financial scandal of all time, primarily because Wells Fargo acted so completely contrary to the interests of its customers. What do you think?

When studying ethics, it is often helpful to use examples like Wells Fargo and other cases to consider possible outcomes and actions in real-life ways. Throughout the course we will share cases like this, in part to help you learn specific principles, but also to help you create value judgments for your own life. To help you create a moral code, so to speak. By so doing, we hope that you will come to a clearer definition of personal ethics in your own life and career.

And while there are many different ethical concepts that we could discuss throughout the course, we are primarily going to focus on five key ethics principles: trust, proximity, accountability, cultural lag, and privacy. Some of these concepts, like trust and accountability, will be really familiar and easy to understand. But some of the others, especially proximity and cultural lag, might take some additional study. And please also keep in mind that even though the basic premise of some concepts might be familiar and easy to understand, the challenge is to extrapolate out and consider how those concepts are going to affect us as technologies change in the future. For example, while we all understand the basic meaning of the term “privacy,” how do you think that concept will adapt and change with the advent of AI and facial recognition software? In this class we will ask you to look into the future a bit and try to predict what likely but unexpected consequences will result, whether good or bad.

Okay, so let’s get started with the first key ethics principle: trust. We have already mentioned trust a lot in this module, and it is probably the simplest concept to understand. For example, it doesn’t take a finance or law degree to understand that the deceptive practices of Wells Fargo and its staff were incredibly unethical, and likely criminal. So we are not going to dwell too much on the concept of trust now. But it is worth repeating yet again that the entire financial system is built on trust, and therefore the bulk of criminal financial law punishes any breach of trust, or of what we professionally call “fiduciary” obligations. And as a side note, for those of you who are familiar with the term “fiduciary,” it might interest you to know that the Latin root of the word literally means “one who holds something in trust.”

Whether it was 250 years ago in a small European village where everyone knew each other, or in the much more complicated global marketplace that we have today, we have to understand that without a certain level of trust, the entire economic system comes crumbling down. Both traditional financial players and new FinTech innovators must keep this in mind, and ensure that their products and services continue to enhance trust. In fact, because financial institutions play such an important role in society, and since most people know so little about complex financial products, most countries have disclosure requirements, meaning that banks have to be truthful and transparent with their customers, making sure they understand the nature of what they are buying or investing in. If banks are not forthright about material information, they can face significant penalties, including fines and possibly jail time. In other words, financial institutions have a higher level of trust placed on them by society, and therefore face higher penalties if they breach that trust.

As a result, one of the major considerations relating to FinTech revolves around the need to ensure that all FinTech applications and innovations enhance social and consumer trust, rather than diminish it. It would be unethical, for example, for unsafe or unclear financial products to be introduced into the market via a new FinTech app. Unfortunately, some early iterations of FinTech have only caused the public to question the ethical use of these technologies. For example, the use of cryptocurrency to facilitate crimes has caused many people alarm. We need to address these concerns right from the beginning, and ensure that tech innovators and finance professionals consider not only the bottom line, but also the importance of maintaining balance and trust in society.
1.3.2 Key Ethics Principle – Proximity

  1. The second core ethics principle that
  2. we will be discussing throughout the course concerns
  3. the concept of proximity.
  4. In psychology, the concept of “proximity” is a key
  5. variable in explaining behavior in many circumstances.
  6. Proximity denotes both how physically close
  7. or emotionally close we are to someone or something.
  8. And differences in proximity can lead to
  9. varied outcomes.
  10. One story that demonstrates the impact of proximity,
  11. is the classic trolley problem.
  12. You may recall a teacher explaining it to you
  13. when you were younger.
  14. If this doesn’t ring a bell, don’t worry,
  15. we’ll do a quick recap.
  16. The typical version of the trolley problem
  17. usually compares two scenarios
  18. where there is a runaway trolley about to
  19. hit a group of five people.
  20. In the first scenario,
  21. you have the choice to divert the trolley with a switch,
  22. pulling a lever which would change the trolley’s direction
  23. and kill one person instead of the group of five.
  24. In the second scenario, instead of a switch,
  25. you are required to physically push a person
  26. in front of the trolley to stop it –
  27. thus saving the group of five
  28. – but killing the person you pushed.
  29. Both actions lead to a similar outcome,
  30. and yet the way that our brains process the situations
  31. is completely different.
  32. The trolley problem has been reviewed
  33. and studied many times, and each case,
  34. nearly everyone opts to divert the trolley
  35. using the switch, and nearly all object to pushing
  36. a person into its path.
  37. This dichotomy highlights the importance of proximity
  38. in people’s decision-making.
  39. If an action is proximate, physically or emotionally,
  40. then we often rely on the “moral” center of our brain
  41. to consider the dilemma.
  42. That is represented by the fact that
  43. almost everyone chooses not to push the man
  44. onto the tracks directly.
  45. Conversely, if an action is non-proximate in nature,
  46. meaning the action and its outcome are separated
  47. even slightly, then we often rely on the “logic,”
  48. or cost-benefit center of our brain to
  49. consider the dilemma.
  50. That is represented by the fact that
  51. nearly everyone opts to pull the lever,
  52. even though the action leads to nearly the same
  53. outcome as pushing the man.
  54. Now this is very important because our world
  55. is increasingly distant and non-proximate in nature,
  56. resulting in our leaders increasingly using amoral,
  57. cost-benefit analysis when making decisions
  58. that can affect broad sectors of society.
  59. Let’s recall the Wells Fargo example we just discussed.
  60. If you compare Wells Fargo, a large, international bank,
  61. to perhaps a bank in a small town,
  62. the role of proximity is pretty clear.
  63. Psychologically speaking,
  64. it’s generally much harder to cheat people
  65. we are proximate to,
  66. people we interact with on a daily basis,
  67. compared to a customer that is just a number,
  68. one person that is part of a large mass.
  69. Accordingly, the concept of proximity applies to FinTech
  70. also.
  71. One great outcome of FinTech is that
  72. it will provide financial access
  73. to a greater number of people,
  74. those that are unbanked or underbanked.
  75. At the same time though,
  76. this technology will probably require less
  77. human interaction, meaning less proximity as well.
  78. So does that mean as proximity declines,
  79. people will lean towards cheating each other more?
  80. Who knows, but what is clear is that
  81. we want new innovations to bring us closer together
  82. and not drive us further apart.

Additional Readings

  • Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M., & Cohen, J. D. (2001). An fMRI Investigation of Emotional Engagement in Moral Judgment. Science, 293(5537), 2105-2108. Retrieved from https://science.sciencemag.org/content/293/5537/2105

1.3.3 Key Ethics Principle – Accountability

  1. The third key ethics principle that we will
  2. discuss throughout the course is accountability.
  3. Accountability is really a subset of
  4. governance and regulation, and is essentially a question
  5. about fairness and who is responsible when
  6. things go wrong.
  7. Many of the governance structures that we rely on in
  8. society try to make it clear who
  9. is accountable when a problem arises.
  10. But as you will see throughout the course,
  11. as the world gets less and less proximate,
  12. it is simultaneously getting harder to determine
  13. who should be held accountable for certain injuries.
  14. And FinTech innovations may be making all of this
  15. even harder.
  16. Consider the Wells Fargo case we discussed earlier:
  17. were the people responsible for violating
  18. customer trust actually held accountable?
  19. As mentioned, the bank’s initial reaction
  20. was to terminate 5,300 mid-level managers for
  21. their involvement in the program.
  22. But what about the leaders who created
  23. and pushed the program?
  24. It seems pretty clear in that case
  25. there was an accountability gap.
  26. This question of accountability is
  27. also relevant for technology.
  28. Consider a social media platform that
  29. you frequently use, say Facebook, Twitter, YouTube
  30. or their equivalents in your country.
  31. If there is inaccurate or even harmful content
  32. posted there, who is accountable for that?
  33. Surely, we would say the individual that
  34. created and posted it.
  35. But should the technology platform itself
  36. that’s hosting the content also be responsible?
  37. This is an important question and in the wake of
  38. fake news and some really tragic incidents,
  39. there is understandably a lot of debate about
  40. who should be accountable.
  41. In some countries like Singapore,
  42. we may have an initial answer.
  43. Singapore is planning to implement a new law that
  44. will require online media outlets to issue warnings,
  45. publish corrections, and in some situations
  46. even take down content
  47. that is false.
  48. Prior to this, such platforms could act at their
  49. own discretion to close accounts or
  50. limit false information.
  51. Perhaps, not anymore.
  52. The United Kingdom may eventually go even further
  53. in its efforts to regulate the internet through a recently
  54. proposed law that would make technology companies
  55. more legally liable for the content they host through
  56. fines, penalties, and direct litigation.
  57. Areas that the possible new law would cover
  58. include content that supports violence, terrorism,
  59. promotes suicide, spreads false information,
  60. and even cyberbullying.
  61. So when considering accountability
  62. for technology companies, including FinTech firms,
  63. it seems that society may no longer be satisfied with
  64. attempts at self-regulation, which then raises
  65. the broad question of how large technology companies
  66. should be regulated.
  67. Additionally, should TechFins be regulated and
  68. treated differently than large financial firms?
  69. What should be the standard
  70. and should that standard be global?

1.3.4.1 Key Ethics Principle – Cultural Lag

  1. The fourth key ethics principle that
  2. we will discuss throughout the course is cultural lag,
  3. which is the idea that it takes time
  4. for culture to catch up with technological innovations,
  5. and that social problems and conflicts
  6. are caused by this lag.
  7. Until now we have talked mostly about finance,
  8. but FinTech is not only about finance;
  9. that’s the Fin, but there’s also the Tech, the technology.
  10. And cultural lag considers the best way to
  11. ethically introduce new technologies
  12. into the marketplace.
  13. Technological innovations are often
  14. characterized by one word:
  15. Disruption.
  16. If you pay attention to Silicon Valley,
  17. it seems like someone is talking about disruption
  18. every few minutes.
  19. “We’re going to disrupt this industry.”
  20. Or “This innovation is built for disruption.”
  21. And while not everything out of Silicon Valley
  22. is really “disruptive,”
  23. many amazing disruptions and innovations
  24. have propelled humankind.
  25. And the pace of disruption seems to be increasing.
  26. Humankind has progressed more technologically
  27. in the past 200 years
  28. than in the previous 20,000 years combined.
  29. But is disruption always good?
  30. And even when the overall impact is positive,
  31. are there ethical issues that should be considered
  32. when introducing innovative disruptions?
  33. The answer obviously is yes,
  34. but we seldom talk or think about
  35. these ethical questions until after the
  36. technology has been introduced, which is often too late.
  37. As mentioned in the Introduction to Fintech course,
  38. as human beings,
  39. we tend to overestimate the effect of technology
  40. in the short run
  41. and underestimate the effect in the long run.
  42. This seems obvious, right?
  43. For example, just think about the far-reaching impacts
  44. that smartphones have had since their introduction.
  45. Can you believe that smartphones were first introduced
  46. only around 10 years ago?
  47. I guess for some of you younger students,
  48. that might not seem like a long time ago.
  49. But for a lot of us, that seems like only yesterday.
  50. Either way, the point is that it has only been 10 years,
  51. but think about how much of an impact
  52. smartphones have had!
  53. Pretty much everyone has them,
  54. and that includes a large swath of the developing world.
  55. And many of the most amazing FinTech innovations
  56. are only possible because of the smartphones
  57. that all of us are carrying around today.
  58. But here’s the thing:
  59. smartphones became popular so quickly that we,
  60. as a society, didn’t really have time to understand the
  61. implications of the technology on our broader culture.
  62. And every time we started to adapt
  63. and adjust to the technology,
  64. tech innovators would introduce some new
  65. feature to stay ahead of our adjustment period.
  66. These are all examples of cultural lag,
  67. and show that technology is able to change
  68. more quickly than society can culturally adapt
  69. to such innovations.
  70. And there’s one important aspect of cultural lag theory
  71. that we need to understand:
  72. sociologists and economists believe that many of
  73. society’s most challenging problems
  74. are often caused by cultural lag.
  75. Again, think about smartphones.
  76. Experts in many disciplines are now emphasizing
  77. that smartphones are actually creating or
  78. reinforcing serious social problems.
  79. We have all heard reports that emphasize that
  80. we spend too much time looking at our smartphones,
  81. focusing on social media to the exclusion
  82. of our actual social circle.
  83. After a decade of not really understanding
  84. the implications of these habits,
  85. people are now working to reduce their screen time,
  86. and many technology firms like Apple and Google
  87. have introduced products to help track
  88. and even lessen screen time, encouraging users
  89. to spend less time on their phones.
  90. There are many more serious examples highlighting
  91. the gap between changes in technology,
  92. which occur very quickly,
  93. and subsequent adaptations in our culture,
  94. which happen very slowly.
  95. And the smartphone example is only
  96. a very simple one.
  97. The reality is that some of the biggest problems
  98. society faces – things that are so big that
  99. we sometimes have trouble seeing or understanding
  100. them – are often tied to technological disruption
  101. and the cultural lag that stems from it.
  102. And while these massive innovations are rightfully
  103. celebrated for their positive impact,
  104. it’s worth considering some correlated points.
  105. For instance, what happens to all the people
  106. who work in industries that are made obsolete
  107. because of the new technologies?
  108. Certainly a lot of people have benefited
  109. from technological innovations, but not everyone has.
  110. Or at least, maybe people don’t benefit equally.
  111. Is it morally necessary for new technologies
  112. to benefit all of society?
  113. And even if that is possible,
  114. should that be an overall goal?
  115. Should that be a normative aspiration
  116. of new technological innovations?
  117. Let’s consider another example:
  118. drones.
  119. Do you have one?
  120. Or do you know someone who does?
  121. They are now pretty popular,
  122. and became so popular so quickly
  123. over the last few years
  124. that governments were caught off guard without
  125. regulations specifically covering private drone use.
  126. And there are some scary aspects of drone use
  127. that people may not have considered previously.
  128. For example, people have weaponized drones,
  129. with one drone even being used to attempt an
  130. assassination of a state leader.
  131. And while some companies are using drones in Africa
  132. to deliver blood for transfusions,
  133. there are also people using drones to drop
  134. contraband into jails and prisons,
  135. or to smuggle drugs across borders.
  136. When considering cultural lag,
  137. laws are some of the slowest changing
  138. aspects of culture.
  139. It can easily take years for even simple laws
  140. to be enacted.
  141. As a result, when drone technology rapidly advanced,
  142. making them affordable for almost anyone,
  143. governments raced to catch up, creating regulations
  144. to help balance public safety with personal recreation.
  145. As is probably clear,
  146. it’s hard to hold someone accountable
  147. for improper drone use
  148. if there is no law defining proper drone use.
  149. Thus, the cultural lag created between
  150. the rapid advancement of drone technology and
  151. the much slower development of drone-related laws
  152. has created some serious concerns,
  153. including disruptions of airports, concerns about privacy
  154. and use of drone cameras around personal residences,
  155. military installations, and other sensitive locations.
  156. So when new technologies are introduced,
  157. and these gaps or lags are created,
  158. who should be responsible for
  159. the negative consequences?
  160. The innovators and inventors?
  161. The government?
  162. The users?
  163. Governments around the world have been grappling with
  164. questions like these for a long time,
  165. and some disruptive FinTech innovations
  166. are going to pose very significant challenges
  167. for regulators
  168. – and some already do.

Additional Readings

  • Ogburn, W. F. (1957). Cultural Lag as Theory. Sociology & Social Research, 41(3), 167-174.
  • Marshall, K. P. (1999). Has Technology Introduced New Ethical Problems? Journal of Business Ethics, 19(1), 81-90. Retrieved from https://www.jstor.org/stable/25074076?seq=1#metadata_info_tab_contents
  • Brinkman, R. L., & Brinkman, J. E. (1997). Cultural lag: Conception and Theory. International Journal of Social Economics, 24(6), 609-627. Retrieved from https://www.emeraldinsight.com/doi/abs/10.1108/03068299710179026

1.3.4.2 Productivity Shifts and Technological Revolutions

  1. Okay, if you are watching this course,
  2. chances are that you work in some type of service
  3. industry like finance, law or accounting.
  4. If so, what is the difference between your chosen career,
  5. versus, let’s say, a farmer
  6. or some other type of blue-collar worker?
  7. For a lot of us, we choose our careers based on security
  8. – industries that we think are safe
  9. – but here’s the reality:
  10. you are a lot more like a farmer than you may realize.
  11. So now you understand cultural lag
  12. and both the challenges and moral implications
  13. that come from introducing disruptive technologies.
  14. From a FinTech perspective,
  15. there are some exciting disruptions
  16. that are right around the corner.
  17. And while these innovations will make many aspects of
  18. life much easier, there is one major challenge
  19. that we feel like we need to address:
  20. the social ramifications from unemployment
  21. and job loss.
  22. Okay, to get this point across,
  23. we need to go back in time again.
  24. Early communities of humans congregated
  25. around each other in villages for specific reasons.
  26. Obviously, protection and socialization were
  27. among those reasons, but there was one overarching
  28. activity that held early societies together:
  29. food.
  30. Early human communities revolved around agriculture,
  31. and many of the most important early innovations
  32. revolved around the growing, harvesting,
  33. and storage of food.
  34. During the Bronze and Iron Ages,
  35. stone and wooden tools were replaced by
  36. more efficient metal tools,
  37. but the main processes in agriculture remained largely
  38. unchanged for thousands of years.
  39. But that changed quickly during the
  40. Industrial Revolution.
  41. In many parts of the world,
  42. during the Industrial Revolution, horse-drawn and even
  43. mechanized harvesting equipment was introduced,
  44. leading to a vast increase in productivity.
  45. This not only sped up the time in which crops
  46. could be planted and harvested,
  47. but it also significantly increased crop yields.
  48. During that time, the number of people working
  49. in agriculture dropped,
  50. but the amount of land that could productively be used
  51. to grow crops grew substantially.
  52. This led to fewer but larger farms.
  53. To put it simply, fewer people were needed to farm,
  54. but simultaneously more food was grown.
  55. In fact,
  56. some historians contend that these improvements
  57. in agriculture “permitted” the Industrial Revolution
  58. because the increase in food production and
  59. decreased need for farm labor meant that more people
  60. could work in urban industries,
  61. providing labor for factories,
  62. large urban utility projects,
  63. and really all the innovation that led
  64. to the rise of the 20th century.
  65. But there were a few obvious problems
  66. that stemmed from this.
  67. First, these advances didn’t occur everywhere.
  68. For example, many of the countries that are still
  69. developing today did not participate equally in these
  70. advancements for a variety of reasons, and as a result,
  71. their economic progress was delayed.
  72. And even in developed countries where these advancements were adopted broadly,
  73. the benefits were not distributed equally.
  74. But there was another more serious issue.
  75. While the machines and tools that were introduced
  76. improved productivity, they also made many jobs
  77. redundant, leaving millions out of a job, and needing to
  78. transition to an entirely new industry.
  79. In the United States alone, agricultural employment fell
  80. from 40% of the workforce in 1900 to only 2% in 2000.
  81. That’s only 100 years!
  82. And while that may seem like a really long time,
  83. in terms of human history that is incredibly short.
  84. So where did all the old farmers go?
  85. Well, the lucky ones were able to find even better jobs in
  86. manufacturing, or even in services related to the
  87. agriculture industry, like logistics, storage, or marketing.
  88. The point is that they had to reinvent themselves,
  89. and rethink the way they defined work.
  90. For many people, this was a great opportunity,
  91. but of course, others got left behind in the process.
  92. Now consider today.
  93. We now have unmanned drones that can plant seeds,
  94. spray and monitor the health of crops,
  95. and even harvest them.
  96. And artificial intelligence and machine learning are being
  97. integrated to help farmers make better decisions
  98. and monitor growth in real-time.
  99. As a result of all this, less and less human labor
  100. is needed to run mega farms.
  101. But for most developed economies,
  102. these changes started more than a century ago.
  103. But what happens when these modern technologies are
  104. integrated into developing countries today?
  105. Well, let’s look at an example.
  106. From 1990 to 2017 – just 27 years –
  107. it is estimated that agricultural employment in China
  108. fell from about 55% of the workforce
  109. to about 17%.
  110. That’s a difference of several hundred million jobs.
  111. And while China has done a good job of expanding
  112. its economy and transitioning those workers
  113. to the manufacturing sector,
  114. you can see how difficult it can be
  115. when productivity shifts make workers obsolete.
  116. In fact, speaking of manufacturing,
  117. this is happening in that sector as well.
  118. Throughout the US and EU, manufacturing jobs
  119. have been slashed through a combination of
  120. innovation and automation.
  121. Many of the workers who lost their employment
  122. have yet to be fully reintegrated back into the workforce,
  123. leading to significant political pressure
  124. and social anxiety.
  125. One question that bears asking is who is responsible
  126. for ensuring these new innovative technologies
  127. are integrated in such a way that
  128. social harm is minimized?
  129. Is that the role of the innovator?
  130. Is that the role of the government? Or someone else?
  131. Okay, so why does any of this matter?
  132. And what does it have to do with FinTech?
  133. Well, if predictions can be believed, we are
  134. about to enter the Fourth Industrial Revolution,
  135. which could bring the most significant disruption
  136. and productivity shifts humankind has ever seen.
  137. Artificial intelligence, blockchain, and other new technologies
  138. will completely alter not only how we work,
  139. but our entire perception of what work is.
  140. Let’s use 2 concrete examples: cashiers and drivers.
  141. Cashiers – the people you pay when you leave the store
  142. – and drivers.
  143. Well, these jobs are the two most common jobs
  144. in the United States,
  145. and it’s the case in many other developed countries
  146. as well.
  147. Well, millions of those jobs are likely to be eliminated
  148. within the next 10 years as automation,
  149. driverless vehicles, and other FinTech innovations
  150. make them obsolete.
  151. In fact, it has been estimated that 38% of
  152. current US jobs are at high risk of being
  153. made redundant by robots and automation
  154. in the next 15 years
  155. – that represents about 60 million jobs,
  156. or 1/5th of the entire population of the United States.
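
As a quick sanity check on those figures, here is a minimal sketch in Python; the US workforce (~160 million) and population (~320 million) are round-number assumptions for illustration only.

    # Rough check: 38% of US jobs vs. one fifth of the US population.
    # Workforce and population sizes are assumed round figures.
    us_workforce = 160_000_000
    us_population = 320_000_000

    at_risk_jobs = 0.38 * us_workforce  # about 60 million jobs
    share_of_population = at_risk_jobs / us_population
    print(f"{at_risk_jobs / 1e6:.0f} million jobs, "
          f"about {share_of_population:.0%} of the population")
    # Prints: 61 million jobs, about 19% of the population
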
  157. And although some new jobs will be invented
  158. that many of these workers will be able to take up,
  159. unlike previous productivity shifts,
  160. these newer innovations are largely replacing
  161. human workers completely, making it difficult for the
  162. unemployed to simply shift into new work.
  163. For example, for a farmer to go work in a factory,
  164. new skills are usually not required.
  165. But for a cashier or truck driver to become
  166. a computer programmer or robotics engineer,
  167. an entire new skill-set requiring years of schooling and
  168. training would be required.
  169. So now let me turn the question back on you:
  170. what are you doing to ensure that
  171. you, your profession, and your career
  172. don’t become redundant,
  173. and that you can stay ahead
  174. of these new emerging technologies?

Additional Readings

  • Schwab, K. (2017). The fourth industrial revolution (First ed.). New York: Currency.
  • Fourth Industrial Revolution. World Economic Forum. Retrieved from: https://www.weforum.org/focus/fourth-industrial-revolution
  • Will Robots Steal Our Jobs? The Potential Impact of Automation on the UK and Other Major Economies. PricewaterhouseCoopers. (2017). Retrieved from: www.pwc.co.uk/economic-services/ukeo/pwcukeo-section-4-automation-march-2017-v2.pdf
  • Solon, O. (2016). Self-Driving Trucks: What’s the Future for America’s 3.5 Million Truckers? The Guardian. Retrieved from www.theguardian.com/technology/2016/jun/17/self-driving-trucks-impact-on-drivers-jobs-us
  • Belton, P. (2016). In the Future, Will Farming be Fully Automated? BBC. Retrieved from www.bbc.com/news/business-38089984

1.3.5 Key Ethics Principle – Privacy

  1. The fifth and final key ethics principle that
  2. we will discuss throughout the course is privacy.
  3. The debates raging around the world over the
  4. concept of privacy are an amalgamation
  5. of all the concepts we will cover in this course, including
  6. trust, proximity, accountability, and cultural lag.
  7. Privacy is one of the key issues of our time,
  8. and is something that we need to start thinking
  9. much more deeply about.
  10. In addition to all the other problems already discussed,
  11. Wells Fargo also experienced privacy and data
  12. breaches.
  13. For example, in 2017 Wells Fargo accidentally
  14. sent out 1.4 gigabytes of files containing
  15. personal information of about 50,000 of its
  16. wealthiest clients, including their social security
  17. numbers and personal financial data.
  18. Luckily, that data breach was fairly limited in its reach,
  19. but what if the data got shared
  20. on the web for all to see?
  21. Who specifically should be held accountable
  22. for such a breach?
  23. That’s actually a surprisingly hard question to answer.
  24. Questions relating to the right to privacy are not new.
  25. But with the advent of smartphones,
  26. facial recognition software, machine learning, and
  27. other FinTech innovations, our right to privacy
  28. in a traditional sense is diminishing rapidly.
  29. For example, people are increasingly worried about the
  30. possibility of being tracked by
  31. their smartphone hardware.
  32. And many common apps have been breached
  33. or even actively misuse
  34. our private personal information.
  35. For example, Facebook has been embroiled in controversy
  36. over the past couple of years due to concerns relating to privacy.
  37. As one of the most actively used social media
  38. platforms in the world,
  39. Facebook is accused of allowing private customer
  40. information to be used for
  41. several unwelcome activities.
  42. They have even been accused of allowing their
  43. platform to be used to covertly influence political elections.
  44. Other new technologies, such as voice recognition
  45. products and wearable devices,
  46. have people worried about who is listening to
  47. and possibly recording their private conversations.
  48. Just think about it: on a daily basis the majority of us
  49. click “I Accept” on so many websites without actually
  50. reading the terms and conditions
  51. that we are desensitized to the fact that these are
  52. actually real legal agreements.
  53. If we as a society are going to take privacy seriously,
  54. we need to consider the moral and legal implications
  55. in a practical context – and ensure that we are clear on
  56. what rights we are giving away.
  57. Maintaining a balance between privacy and profitability
  58. in the commercial sector,
  59. or security in the public sphere,
  60. is an increasingly important challenge.
  61. But has the age of privacy in the traditional sense already ended?
  62. Have we already given up so much data via
  63. social media and smartphones that there is
  64. no turning back?
  65. And should the race to create sentient AI,
  66. which requires massive amounts of data,
  67. take precedence over personal privacy?
  68. These questions, and many more will be discussed
  69. throughout this course,
  70. and we look forward to hearing your thoughts
  71. on how to best navigate these tricky privacy waters.

Additional Readings

  • Kovaleski, S. F., & Cowley, S. (2017). Wells Fargo Accidentally Releases Trove of Data on Wealthy Clients. The New York Times. Retrieved from https://www.nytimes.com/2017/07/21/business/dealbook/wells-fargo-confidential-data-release.html

1.4 Module 1 Conclusion

  1. Throughout this module we have considered
  2. some of the underlying reasons we have money
  3. and financial institutions in the first place,
  4. thus helping us to understand the ethical foundations of
  5. FinTech innovations.
  6. The reality is that money is a societal construct
  7. based on trust, and the value ascribed to
  8. money is somewhat subjective.
  9. As a result, for centuries societies have relied on a
  10. shared definition of monetary value,
  11. as well as trust in banks, to ensure our money and
  12. economies are stable and secure.
  13. But unfortunately, throughout history,
  14. including the years since the financial crisis,
  15. some financial institutions have forgotten their
  16. important role in society,
  17. and have breached that foundation of trust.
  18. This has led many to embrace non-traditional
  19. FinTech innovations as a way to democratize finance,
  20. and potentially move away from
  21. traditional financial industry players.
  22. Both finance and FinTech companies
  23. need to keep this in mind,
  24. and ensure that their innovations
  25. provide the highest level of societal trust possible.
  26. By walking through the Wells Fargo case,
  27. we have introduced each of the key ethical principles
  28. that will be highlighted throughout this course.
  29. Once again, those principles are:
  30. trust,
  31. proximity,
  32. accountability,
  33. cultural lag,
  34. and privacy.
  35. Keep those in mind as we
  36. proceed throughout the course.
  37. Next, in module 2 we will introduce a technology
  38. that most of you have heard of before:
  39. blockchain.
  40. While many are excited about the many efficient
  41. and cost-saving uses of blockchain,
  42. others have highlighted its use in
  43. facilitating illegal activities.
  44. Let’s consider both of these together,
  45. to hopefully ensure the use of blockchain will be ethical,
  46. and will help lead us toward a more utopian society.

Module 1 Roundup

  1. – Hi everybody.
  2. Welcome to the weekly wrap-up where we discuss
  3. various course related matters.
  4. First, we want to give a huge thank you
  5. to everyone who’s participated in the course so far.
  6. We’ve had a really great response
  7. and we’re so happy for all the amazing comments
  8. on the discussion forums.
  9. – Yeah, the response has been great and currently
  10. there are 4,777 of you enrolled in the course
  11. from 154 countries or regions around the world.
  12. Which is great.
  13. We’re so thrilled to see many parts of the world represented
  14. and are especially grateful for those of you
  15. who are including specific examples from your home countries
  16. and cultures in the discussion forums.
  17. Finance, fintech and tech disruptions are affecting
  18. various parts of the world in different ways.
  19. So it’s great to hear local perspectives
  20. on everything we’re doing.
  21. – Yeah.
  22. In response to the poll questions,
  23. it was great to see that most of you believe
  24. an hour of education is worth more than 100 eggs.
  25. So I guess that means that you value education,
  26. which bodes well for our future as educators.
  27. But we also think it’s cool that most of you
  28. wanted the whole chicken.
  29. It shows you’re savvy negotiators.
  30. – Now we also thought it was interesting
  31. that the majority of you out there still trust banks
  32. over fintech startups and techfins,
  33. but 28% of you don’t trust any of them.
  34. Now that’s a really important statistic
  35. that we hope you explore personally
  36. and in the discussion forums as the course continues.
  37. You know it’s incredibly important
  38. that financial service firms do a better job
  39. establishing trust in the marketplace.
  40. And if you all out there are representative of the market,
  41. then it’s clear many people around the world
  42. do not trust financial firms either.
  43. – And speaking of trusting banks,
  44. many of you commented that you trust banks more
  45. because they are better regulated.
  46. And your deposits are insured.
  47. Now while we agree and think that those are really
  48. reasonable ideas, it does make us wonder.
  49. Isn’t one of the major focuses of fintech innovation
  50. to avoid regulation and government intervention?
  51. And isn’t regulation one of the things that makes
  52. banks inefficient and hard to deal with in the first place?
  53. Now we look forward to hearing your feedback
  54. as the course moves on to see how the fintech
  55. and techfin firms can continue to build trust
  56. while maintaining their efficiencies.
  57. – Now when thinking about proximity,
  58. 77% of you said that you would pull the lever
  59. to divert the trolley from hitting the five people
  60. killing the one person instead.
  61. But on the other hand, 61% of you said
  62. that you would not push the man off the bridge
  63. to save the five people.
  64. These results largely mirror what academics have found
  65. as they researched these questions in the past.
  66. This is a great example of how proximity
  67. can alter our decision-making.
  68. – And you’ve provided a lot of really great examples
  69. of proximity affecting our behaviour in real life.
  70. Now some of you mentioned how we’d be willing to donate
  71. our organs to save a loved one,
  72. but maybe not do the same thing for strangers.
  73. Or how we often donate to our churches or charities
  74. in our local communities, but then don’t do the same thing
  75. for problems that are affecting people far away.
  76. And one of you mentioned stealing
  77. from a bank versus cybercrime.
  78. Now we’re gonna talk about issues of bank theft
  79. in the next couple of modules,
  80. so remember that one for later.
  81. – Now, concerning the disruption that will come
  82. from tech innovation.
  83. A full 60% of you said that you were either very
  84. or somewhat concerned that the disruption and the resulting
  85. job loss would create broad social problems.
  86. And interestingly, when asked which industries you thought
  87. were most at risk of disruption,
  88. 41% said accounting and auditing,
  89. while 34% said finance.
  90. By far the two most common answers.
  91. Is that because most of you are from the accounting
  92. and/or finance industries?
  93. And therefore you see the risk of automation
  94. in those industries?
  95. Or is this just an overall observation?
  96. – Yeah, I was wondering about that.
  97. Now either way, we hope that you’re all thinking
  98. more deeply about your life and your career
  99. and how you can kind of future-proof yourself.
  100. Meaning planning ahead to ensure that you’re
  101. not made redundant and are always employed
  102. doing something that you love
  103. and that’s meaningful for society.
  104. – Now, we also appreciated all the great comments
  105. about cultural lag.
  106. We know that this is probably a new concept for many of you,
  107. but basically cultural lag means that technology
  108. adapts and changes faster than culture.
  109. Especially in areas like law and religion.
  110. And as a result sometimes technology changes so fast
  111. that it takes a while for us to realise the negative impacts
  112. that technology may be having on society.
  113. – And you provided really good examples
  114. of cultural lag in everyday life.
  115. Including the impacts of social media on teenagers,
  116. especially from cyber bullying
  117. which is kind of a sad but really relevant thing.
  118. The spreading of fake news and false stories
  119. via social media which is made possible
  120. because of the omnipresence of smartphone technology.
  121. And the environmental costs of mining Bitcoin
  122. and that’s one of the topics that we’re gonna discuss
  123. in the next module.
  124. – Finally we also learned that students
  125. out there all over the world are smart
  126. and disciplined with their money.
  127. When asked what you would do with a million dollars
  128. you all answered really responsibly.
  129. We were expecting some really crazy answers
  130. like trips to Las Vegas or buying an island or something
  131. but most of you talked about paying off debt,
  132. investing for the future, particularly kids’ schooling
  133. and helping the community even.
  134. Now, in our next module, module two,
  135. we’re gonna look at blockchain and it’s governance.
  136. Now we won’t cover blockchain in too much technical
  137. detail in the next module, but rather look at some of
  138. the policy implications and the governance implications
  139. of what blockchain technology means to us.
  140. We’re gonna look at a few interesting cases
  141. including the Silk Road which is a really fascinating
  142. case almost straight out of a movie.
  143. Additionally we will look at some technical details
  144. of things like smart contracts
  145. and what they mean.
  146. We’ll also look at form remittances
  147. and how blockchain comes in and plays a significant role
  148. in transferring money from place to place.
  149. Now there are obviously technical details around that.
  150. Now we won’t go into specific coding or anything like that
  151. or crypto-currency or anything along those lines.
  152. But we’re gonna look at some real-world implications
  153. of this new technology and how it really impacts people.
  154. And hopefully in some ways improves their lives.
  155. – Okay, so you’re not gonna be telling them
  156. what crypto-currencies to buy?
  157. – Well unfortunately we don’t have that knowledge.
  158. If we did, we probably wouldn’t be sitting here
  159. and earning professor and teacher salaries from the university.
  160. All we can say is make sure you do your research.
  161. – Yeah and as the course moves on, we hope that you all
  162. still really maintain your engagement
  163. and especially within the discussion forums.
  164. One of the things that we’re super excited to see
  165. was all the comments from all over the world.
  166. We had comments from those in Kenya,
  167. all over South America.
  168. People from Europe, North America and obviously here
  169. within the region of Asia where we are.
  170. It was super cool to see your comments
  171. especially in terms of the very personal
  172. and kinda specific way that these things apply
  173. within your local communities.
  174. Within your specific jobs and careers.
  175. And hopefully in the way that you think these
  176. are gonna be disrupting and changing the way
  177. those things operate.
  178. Your communities operate.
  179. Your career, your industries operate in the future.
  180. So please stay engaged
  181. and we’re really looking forward to more.
  182. – Yeah and I’ll go with what David Bishop has said,
  183. some of the feedback we’ve gotten has been fabulous and–
  184. – Really excellent, very thoughtful comments.
  185. – The comments have been great, which we appreciate,
  186. and at some point we do get to all of them.
  187. One of us will look at each of them and we’ll read them
  188. and try to comment where we can.
  189. Additionally some of you have told us
  190. that you’ve finished the module,
  191. you thought it was great and you’ve passed
  192. it on to your friends and colleagues and acquaintances.
  193. So if you feel like this is really compelling information
  194. and things that we’re sharing
  195. and questions that you feel are important
  196. to think about and consider,
  197. please do introduce it to other people
  198. around you, and I think collectively we can have
  199. increased quality and quantity of discussions,
  200. because these are really important questions.
  201. – Yeah, one last point.
  202. Because I have been reached out to.
  203. As you’re gonna learn later on,
  204. this course talks a lot
  205. about migrant workers.
  206. So I’ve actually been contacted by literally hundreds
  207. of foreign domestic workers here in Hong Kong,
  208. mostly from the Philippines and we wanna say
  209. thank you for joining us and say don’t worry
  210. if you’re not a finance expert.
  211. We’re super excited to have people
  212. from developing countries all over Asia
  213. and really all over the world
  214. that are looking into these insights
  215. and really kind of engaging in these topics
  216. and conversations with us.
  217. So if you don’t understand everything,
  218. that’s totally okay.
  219. Within the discussion forums it’s absolutely
  220. appropriate to ask questions, right?
  221. And really kind of engage with us in that level
  222. because we really hope that all people
  223. and all communities can kind of benefit
  224. from these insights going forward.
  225. – Absolutely, because one of the key components,
  226. I think, of our course, this idea of the ethics
  227. of new technologies and the risks of new technologies,
  228. is that it should be inclusive, right?
  229. – Yeah.
  230. – A part of our mission of doing this course
  231. is to create a greater level of financial literacy
  232. as well as digital literacy around some of these new trends.
  233. So that people have the right questions
  234. and think about kind of the right factors
  235. to weigh as we navigate this kind of
  236. new world and technological future that we have.
  237. – Yeah, especially because otherwise it kinda gives the wrong
  238. impression that we actually know the answers.
  239. (laughing)
  240. I mean the reality is these are very complicated
  241. very complex problems that really haven’t even fully
  242. developed yet and so again this is meant to be
  243. a broad conversation and so we really appreciate
  244. all people of all levels joining in
  245. and kinda sharing your insights.
  246. Asking your questions and we’ll be asking them too
  247. and we’re very sincere in that because we don’t have
  248. the answers, but we hope that society and our community
  249. within the course kinda develops some insights
  250. in some of these questions together.
  251. Okay, so speaking of the great discussions we’ve had,
  252. we did want to give a shout-out to a few
  253. of the people that are posting,
  254. because we really appreciated some of their comments.
  255. So first of all, from peter-nyc.
  256. He had some really good comments and the one we want to talk
  257. about is the way he highlighted kind of the role,
  258. or the purpose, of a business,
  259. and kind of the way society defines success.
  260. So he noted that maybe the behaviour of Wells Fargo
  261. was to be expected because what we’re kind of taught
  262. in business school is that the role of business
  263. is to generate profit and especially for the shareholders.
  264. And that’s exactly what they did.
  265. And more specifically on the kind of personal side
  266. he said that for many people the definition of success
  267. is really largely kinda tied to their paycheck basically.
  268. – Yeah. – Right?
  269. So what were some of these thoughts–
  270. – Yeah, so I thought Peter’s comment
  271. or peter-nyc’s comments were super insightful
  272. but I think on the first point about what the purpose
  273. of businesses is as it relates to profitability,
  274. that has kind of entered into academia
  275. and filtered into mainstream business
  276. over roughly the last three to four decades.
  277. – Late 70s.
  278. – Yeah, kind of as the norm.
  279. And if you didn’t do that, somehow that was incorrect
  280. and actually that’s not true and I think both of us
  281. teach that in our normal classes that we teach
  282. in business ethics and things like that.
  283. That outside of a few circumstances,
  284. profitability, or profit maximisation,
  285. is actually not a legal requirement in many situations.
  286. And in terms of schools of thought beyond
  287. the kind of traditional Chicago-school, Milton Friedman-esque
  288. profit maximisation, there are actually
  289. other schools of thought. I think
  290. Peter, in his comments, seems to be much more heavily
  291. influenced by management gurus like Peter Drucker,
  292. who kind of subscribed to,
  293. and kind of advanced, some of the things
  294. I think Peter is discussing.
  295. So I think that’s important to point out that profit
  296. maximisation is not the default and doesn’t have to be
  297. an increase in just being a shift away from that.
  298. I think to the second point about how we value ourselves.
  299. – Yeah. – Is it based on a paycheck?
  300. I think that Peter rightly pointed out,
  301. peter-nyc rightly points out, that’s faulty logic too
  302. and if you are always–
  303. – Or at least it relates to maybe difficult
  304. unintended consequences.
  305. – Yeah, I mean and if you’re always measuring yourself
  306. using that as the metric, you’ll always be behind.
  307. – Yeah. – For that’s the one thing.
  308. You’ll never be happy or satisfied.
  309. But we also know, psychologically speaking,
  310. that money actually doesn’t bring happiness.
  311. Now for sure, if you don’t have enough money
  312. to cover kind of the needs
  313. that you have on a day-to-day basis,
  314. we also know that will make you unhappy psychologically.
  315. But we know that once you move beyond that,
  316. the more money you make doesn’t necessarily
  317. make you happier.
  318. – And we both know enough wealthy people, I think,
  319. to say pretty definitively that the size
  320. of your checking account doesn’t necessarily mean
  321. you earned it or you’re smarter–
  322. – Sure, sure.
  323. – Than everybody else right?
  324. – In many respects you may have been luckier
  325. than everybody else.
  326. – Yeah, yeah. A lot of it is timing and what not.
  327. So typically what I tell my students
  328. when we’re teaching business ethics
  329. is that the coolest thing about the concept of success
  330. is that it’s a completely subjective term
  331. and you get to define it.
  332. And so one of the things that we want to do
  333. within this course – really, I think,
  334. the underlying reason why we decided to do this at all –
  335. is because we wanted to kind of highlight a different
  336. definition of success and have people,
  337. especially in the financial and kinda tech industries,
  338. think ahead and say okay what type of future do we want
  339. and how are we going to define success?
  340. And I think peter-nyc, one of the things
  341. that’s really encouraging is to see so many
  342. leaders in the financial industry really also leading now
  343. in this kind of fight against short-termism
  344. and fighting against this idea that all you should be doing
  345. is looking at the quarterly earnings
  346. statements and stuff, and really pushing
  347. towards more long-term investment.
  348. So you’ve got leaders at BlackRock and JP Morgan
  349. and Goldman Sachs and other places that are really,
  350. actually including Warren Buffett himself,
  351. saying that focusing only on short-term profits
  352. is probably a losing battle and it’s something
  353. that we need to move away from.
  354. – Yeah, and I think what’s also interesting
  355. some of the follow-up comments or in the follow-up
  356. comments that peter-nyc had,
  357. he talked about culture a little bit
  358. which I think is really important.
  359. I think both at the individual level,
  360. in terms of how we measure ourselves.
  361. What’s success?
  362. That’s all part of like the cultural mantra
  363. that you want to have for your personal life.
  364. But also, I think he was talking more about organisations,
  365. banks and other institutions,
  366. and the culture that they have. And he shared an article
  367. where it talked about students
  368. who went to Harvard Business School.
  369. – Yeah.
  370. – Their idea was, I want to go change the world,
  371. but they come out wanting to work at an investment bank.
  372. – Yeah. – Having worked
  373. at an investment bank– – Our students.
  374. – Yeah, having worked at an investment bank,
  375. there’s nothing necessarily wrong with working
  376. at an investment bank and that’s fine,
  377. but the idea of how being in a particular culture
  378. then changes what seems to be important to you,
  379. what your aspirations are.
  380. I think that’s important to understand
  381. because I think, if you have certain values,
  382. you wanna put yourself in the culture,
  383. or create the culture around you and within you
  384. to ensure that you can live true to those values.
  385. Which I think is quite important.
  386. – Yeah, so that means peter-nyc, if you are from NYC,
  387. if you’re from New York City,
  388. which is one of the world’s other great financial centres,
  389. then share this course with your friends and colleagues
  390. and let’s start changing the narrative
  391. and having people define success differently
  392. and start looking at the role of financial institutions
  393. and all companies in a more ethical
  394. and kinda morally centred way.
  395. – Thanks Peter.
  396. We’ve got another comment from JoergHK
  397. and hopefully that means you’re in Hong Kong,
  398. which is so great.
  399. But the comment was in response to the question
  400. about who is responsible for the negative consequences
  401. basically of tech innovation?
  402. And JoergHK suggested that perhaps
  403. the burden of that responsibility lies on the government
  404. to find and fund policies
  405. through taxation to deal with the negative externalities
  406. of technological innovation.
  407. So that raised a very interesting debate
  408. I think in the discussion forum, and this is something
  409. we frequently discuss between ourselves
  410. and within our classes here at The University of Hong Kong
  411. about who is responsible for potentially the displacement
  412. of labour, people who can’t find jobs.
  413. Technological advances that may make
  414. certain companies obsolete.
  415. Who is responsible for those things?
  416. Because on a wide scale those things will happen,
  417. and already have been happening.
  418. And again, that created a bit of a discussion
  419. and a user with the username Jessielam2018 responded that
  420. maybe the whole obligation doesn’t actually fall
  421. on the government, but there are other stakeholders
  422. who should get involved.
  423. So that’s a great place to start
  424. with this very important question
  425. and David, what do you think?
  426. – It’s super complicated and this is really
  427. at the crux of what we want to talk about
  428. for the whole course.
  429. And so again really appreciate the kind of sophisticated
  430. dialogue, very very kind and thoughtful dialogue as well.
  431. I think one of the things we’re trying to do
  432. is to foster a place where people can disagree
  433. in a respectful way and we appreciate you
  434. for doing that.
  435. So on the one hand you have the role of the government
  436. and Joerg also mentioned, JoergHK also mentioned,
  437. statements within the World Economic Forum
  438. and the idea that, yes people are going to be displaced.
  439. – Yeah. – Right?
  440. And so there are going to be a lot of jobs
  441. that are new, invented.
  442. But there’re gonna be a lot of people who are not really
  443. able to move or upgrade within the workforce.
  444. And we’re seeing that across the world, right?
  445. So again, I’m from the US.
  446. I’m from an area where they had a lot of manufacturing,
  447. even just a decade or two ago.
  448. In fact, I used to work, you may be surprised to hear,
  449. I used to work– – That you used to work?
  450. – Yeah yeah.
  451. – That is surprising.
  452. – I used to work in a woodworking factory.
  453. So I worked in a factory in rural Southern Georgia
  454. with kinda salt of the earth, normal people
  455. and the reality is that with automation,
  456. those are the types of jobs that over the past 20 years
  457. have largely gone away.
  458. And now–
  459. – Or, moved to other countries–
  460. – Well, sure, sure, excuse me.
  461. They’ve left my area of the US, so you see a lot of areas
  462. where they used to have a lot of manufacturing work
  463. that is now gone, so in the Rust Belt for example.
  464. Ohio, Pennsylvania, etcetera.
  465. And that has had a massive influence on politics
  466. and so many aspects of everyday life.
  467. Including social ramifications such as the opioid crisis
  468. which is now going on.
  469. So you can see this is like a domino effect
  470. of negative consequences and so that’s why it’s so important
  471. for us to think ahead. And some of the comments
  472. that everyone made were that this is not just restricted
  473. to manufacturing or, previous to that, agriculture, right?
  474. So now we’re seeing that people are saying well,
  475. we think that finance is gonna be disrupted,
  476. we think that accounting and auditing
  477. are gonna be disrupted.
  478. We’re former lawyers, we think that the law–
  479. – The law will be–
  480. – Yeah, legal field is gonna be disrupted.
  481. And so the question is, how do we kind of reeducate
  482. and reintegrate workers back into society?
  483. I do see Jessielam2018’s point though.
  484. This is a problem.
  485. The government is typically reactive, right?
  486. – Yeah.
  487. – And often can’t be proactive in these areas
  488. and so we do need other aspects of society
  489. and from the government’s standpoint we need more
  490. proactive kind of positive incentives as well.
  491. So she mentioned tax policy for example.
  492. So you can have taxes, tax policy that encourages people
  493. to donate to charities for example.
  494. Or maybe we can create tax policies that encourage
  495. more innovation or job creation and other things.
  496. – Yeah, and I think one thing that is important
  497. to point out is that this trend that David Bishop
  498. just described, and that was described by JoergHK
  499. and followed on by Jessielam in the comments
  500. and discussion and by others – this trend
  501. of displacement of labour – is actually not new.
  502. – Hmm.
  503. – This is something that we talk about a lot
  504. because of technology.
  505. Maybe it’s because that the pace of that disruption
  506. is increasing.
  507. – Right.
  508. – But even if we look back just 20 or 30 years
  509. at the locations of manufacturing
  510. of electronic goods or garments or even shoes,
  511. we saw this move: it started in Japan,
  512. then moved over to places like Taiwan and Korea.
  513. – Including here in Hong Kong. – And in Hong Kong.
  514. Then moved over to mainland China.
  515. And mainland China was the place for much
  516. of that manufacturing because of its low-cost,
  517. somewhat skilled labour,
  518. and that was a nice combination. But as that skilled labour
  519. started to creep up in cost,
  520. much of that manufacturing has moved either
  521. inland into other parts of China
  522. or actually to parts of South East Asia as well.
  523. – Right.
  524. – And so this pattern of displacement
  525. is not necessarily driven by technology alone;
  526. it’s obviously driven by cost too, and those factors
  527. I think are intersecting now to make the pace
  528. of change a bit faster
  529. than perhaps what we’ve seen before. So one thing that
  530. I think we need to do as we continue this process
  531. is to place these changes in the proper historical context
  532. and understand that we often like to think
  533. the situation we’re in, at our point in history,
  534. is quite unique – and frequently
  535. there are some unique aspects to it –
  536. but most of the time something like it
  537. has probably happened in the past.
  538. You know, putting it in the proper context
  539. is helpful. And I think the thing,
  540. more practically speaking, related to policies,
  541. is that if we rely on governments
  542. alone to solve the problem,
  543. the issue that we run into
  544. is that there’s usually a timeline mismatch.
  545. So particularly for governments and politicians
  546. who are elected into office,
  547. their kind of view or their perspective of time
  548. is basically from the time they enter office
  549. until the time they’re trying to get reelected.
  550. Whereas if you’re the labourer who is getting displaced,
  551. your perspective may be very different.
  552. And so if you rely solely on government to solve
  553. those problems, to be frank, that could be a bit
  554. precarious as well.
  555. And so in that capacity then who is involved?
  556. There is the government, obviously.
  557. There is the individual, but should industry
  558. be part of that process?
  559. Of course.
  560. I definitely think so.
  561. Should educational institutions like us?
  562. Of course.
  563. I think there’s a very healthy debate
  564. that could be had about there are traditional
  565. research-based universities like we’re at,
  566. The University of Hong Kong. – Right.
  567. – And they do serve their purposes in certain ways
  568. but do all institutions of tertiary education
  569. have to be like us?
  570. I don’t necessarily think so.
  571. – Right.
  572. – Could more of them be tailored to helping re-school,
  573. re-skill people who potentially
  574. are at risk of being displaced?
  575. For sure. And I think we need to think about that
  576. and then how certain tax policies, tax credits,
  577. government policies, government credits could be applied
  578. to make that more effective.
  579. – And there are a lot of government resources
  580. that are being utilised for that purpose now.
  581. – At least in certain countries.
  582. Schools in South Korea, for example,
  583. because there aren’t as many children
  584. in primary and secondary schools,
  585. are actually bringing in the elderly
  586. from the countryside to come in and learn
  587. how to read, partly so they can keep
  588. those resources fully utilised.
  589. And so here at The University of Hong Kong,
  590. my colleagues and I actually created a weekend programme
  591. called EmpowerU, where we bring in migrant workers,
  592. foreign domestic workers, mostly from the Philippines.
  593. And they come and use the resources and are taught
  594. by professors and company leaders and stuff
  595. in areas that help them improve their skills as well.
  596. So I definitely agree with David Lee,
  597. this is not necessarily anything that is new,
  598. however, I do hope that there is one thing that is different
  599. from before, like in all aspects of society
  600. and all aspects of learning, our hope is that we can learn
  601. from the mistakes of the past and kinda forge together
  602. a better version of the future this time around.
  603. So if we are going to make a difference,
  604. if we are gonna have a better future,
  605. this means looking back at the last 30, 40 years.
  606. Understanding what went well, what we can change.
  607. So that as this new wave, this new fourth industrial
  608. revolution kind of sets in, we will have a plan
  609. in place and society doesn’t have to reap
  610. these kinds of negative unintended consequences.
  611. You know one of the discussions that I found
  612. most interesting, probably ’cause it kinda hearkened
  613. back to my legal training, was the conversation
  614. about accountability and especially who
  615. should be accountable for harmful, negative,
  616. maybe even untruthful things that are posted online.
  617. Should it be the person who posts them?
  618. Should it be the platform or some combination of the two?
  619. And a lot of you kind of chimed in and said
  620. that it is the poster’s fault and the poster
  621. should be accountable for anything that is harmful.
  622. Anything that is untruthful.
  623. And so we had one comment from a user
  624. and I’m not sure how to pronounce your username.
  625. Xeilani or something?
  626. X-E-I-L-A-N-I.
  627. Really really insightful, kinda long comment
  628. that created a nice dialogue.
  629. Really focusing on how it is the person who makes
  630. the comment who should be responsible
  631. for those words, especially if they’re untruthful.
  632. And how, obviously, the platform is also responsible
  633. for having a range of tools to make sure
  634. that comments are being read and analysed properly.
  635. But really at the end of the day it’s kind of up to us.
  636. And so my comment back was, well, that totally makes sense,
  637. and actually some regulation is probably required,
  638. or at least more regulation is inevitable.
  639. But one of the things that I was wondering was,
  640. who is it that we trust to define harmful,
  641. or even untruthful, right?
  642. Because unfortunately truth is often
  643. in the eye of the beholder and it’s often
  644. very difficult to kinda decide.
  645. And it kind of spurred a nice conversation.
  646. So what, where do you fall on this?
  647. I mean, there’s no answer and there’s certainly
  648. no right or wrong answer yet,
  649. but in terms of this conversation
  650. of like increased regulation
  651. and kind of punishing the people
  652. that put harmful things up there versus the platforms,
  653. I mean what are some of your thoughts on this?
  654. – Yeah, so I think kind of big picture.
  655. Like if we look at the kind of the ecosystem
  656. so to speak of the relevant parties involved,
  657. you know there’s obviously the content producer.
  658. There’s the platform.
  659. There’s the viewer.
  660. – Yeah.
  661. – Should that responsibility be portioned amongst
  662. those main parties, or are there other parties
  663. that we’re forgetting about that should also be included?
  664. So I think maybe that’s one of the foundational
  665. questions you start with.
  666. If we look back at different forms of media publication,
  667. frequently it was the platform that had more responsibility
  668. for the content they put out.
  669. – And largely– – The difference is–
  670. – Well go ahead.
  671. – Here we go right?
  672. This is the difference, they’re screened.
  673. But they can still–
  674. – Well, they’re screened and publication was very very hard.
  675. Right?
  676. So there were only like a handful of newspapers,
  677. magazines whatever right?
  678. – And if you were an entry-level reporter
  679. a generation ago, maybe you started off
  680. as a fact checker.
  681. – Yeah. – Right?
  682. And so this is why there was credibility associated
  683. with many publications that were kind of well-known
  684. newspapers or well-known kind of magazines,
  685. current events magazines.
  686. Because people had a belief or faith,
  687. and they had a system that kinda filtered out
  688. the things that were incorrect.
  689. – Yeah.
  690. – And if–
  691. – And you had limited circulations
  692. so you had to be truthful because if you lose
  693. the consumer’s trust, you lose their business.
  694. – And your advertising business as well, right?
  695. And then, and part of that too is if they did find
  696. a problem, they would clearly correct it.
  697. They would say this is a correction,
  698. this was incorrect in the last article
  699. or something like that.
  700. And there was that kind of process that people
  701. felt like these were the norms
  702. of information kind of sharing and media.
  703. I think now with the platforms that we have
  704. that’s not the case obviously,
  705. because you can produce something and it will
  706. kind of basically go through no screening
  707. on a lot of platforms and will just be posted.
  708. – Yeah. – Right?
  709. And then, which is a very different kind of issue
  710. and then if there are inaccuracies there,
  711. then that process of rolling that back
  712. is a little bit more difficult than we had in the past.
  713. Now there are countries, as I think we mention later
  714. in some of the modules that are trying to address
  715. this through legislation.
  716. Singapore is one of these countries.
  717. United Kingdom is one of these countries
  718. that are trying to introduce legislation
  719. to make platforms more responsible.
  720. Particularly for the veracity or truthfulness
  721. of the things that are posted.
  722. And those are countries that have decided
  723. to go in a particular direction to make platforms
  724. more responsible.
  725. It remains to be seen how that plays out.
  726. How it’s implemented.
  727. Some of those laws are just planned
  728. but they haven’t passed any sort of legislation yet.
  729. And so, one, those are of interest, but two, I think
  730. one of the broader issues for a lot
  731. of platforms is where they’re hosted
  732. and where some of their data is being stored.
  733. You know there’s definitely a multi-country,
  734. multi-jurisdictional component to it.
  735. – Yeah.
  736. – Just because you get one country on board
  737. doesn’t actually mean you can solve the whole problem.
  738. – Yeah. – That’s another problem.
  739. – So what you’re getting to is a kind of legal concept.
  740. We talked about accountability but let’s be more specific
  741. about liability, right?
  742. This is where it really gets challenging but let’s say
  743. that Xeilani or however you say your username,
  744. let’s say that it should be the poster who is ultimately
  745. liable for what he or she or it, if it’s a company
  746. or bot or something, posts, right?
  747. What if that person is located in a different country?
  748. Which is very possible if not likely.
  749. Do you have zero recourse in terms of finding
  750. and then actually suing that person?
  751. Right?
  752. So this is actually really reminiscent of a legal
  753. discussion that went on in the 1970s
  754. tied to manufacturing very much like what we’ve just talked
  755. about a moment ago.
  756. When manufacturing was done in one country,
  757. let’s say the US, you’re manufacturing for US consumers,
  758. it’s really easy if they manufacture something
  759. incorrectly and someone is harmed by it,
  760. you just go after the manufacturer.
  761. But then what happened was, all the manufacturing in the US
  762. started going abroad, overseas, right?
  763. And so then all of a sudden, if you buy a toy
  764. or if you get a toy in a McDonald’s Happy Meal
  765. that ends up, like unfortunately harming
  766. your child, let’s say.
  767. You’d have to find the factory in that overseas country
  768. and actually go after them directly.
  769. That was almost impossible which meant that product
  770. liability as an entire legal concept,
  771. essentially became useless.
  772. And so the result was,
  773. they created something called strict liability,
  774. which means no-fault liability.
  775. And this is complicated but basically,
  776. let me summarise by saying,
  777. the point of these laws is to ensure, if possible,
  778. that the injury doesn’t occur in the first place,
  779. but then if it does, the person who is
  780. in the best position to make sure it doesn’t happen,
  781. actually kinda provides compensation
  782. to make the injured party whole, okay?
  783. So what that means is, what if you get some random
  784. person that you cannot identify, who is on some island
  785. somewhere that’s making false or fake comments,
  786. who is in the best position to ensure
  787. that doesn’t happen?
  788. Maybe it’s that individual, maybe it’s Google,
  789. maybe it’s Facebook, I don’t really know.
  790. But here’s the other side of the coin though
  791. and this is where it gets really really challenging.
  792. Because when you talk about restricting speech
  793. you are unintentionally also limiting political speech.
  794. Religious speech etcetera.
  795. And this is actually playing out right now.
  796. So you have, because these large platforms,
  797. including YouTube, Facebook etcetera,
  798. are really concerned about fake news,
  799. hate speech and these things that are creating
  800. big losses for them, they’re now starting to pull
  801. people off their platforms,
  802. including a disproportionate number
  803. of very conservative
  804. political
  805. commentators.
  806. And so now you have people who sit on that end
  807. of the political spectrum saying, wait a minute,
  808. this is discrimination, right?
  809. How is it that our ideas are somehow harmful?
  810. How come our ideas?
  811. And again that’s easy for us to say
  812. when we disagree with those ideas
  813. but what if it’s your religion?
  814. What if it’s your political ideas that are being suppressed
  815. because you’re no longer able to get on that platform?
  816. These are very real concerns that on the one hand
  817. we want to make sure that these false statements
  818. are addressed but at the same time we also
  819. have to understand that every time we limit speech,
  820. we’re limiting one of the most fundamental rights
  821. that people have.
  822. It’s very challenging.
  823. – Yeah and I think that’s a great point
  824. and it raises a connected question:
  825. people have a right to speech
  826. in most countries.
  827. – Right.
  828. – Or a lot of countries that we deal with.
  829. But do they have a right to that platform
  830. to express their speech?
  831. – It’s a valid platform, yeah.
  832. – It raises other questions about you know kind of access
  833. and at a certain point
  834. access to certain digital platforms is that a right?
  835. Right now we generally say no, but like in this part
  836. of the debate it becomes an interesting question.
  837. I think if we go through some of the discussion points
  838. in response to what Xeilani first brought up,
  839. there’s some really interesting points.
  840. I think peter-nyc had another great comment
  841. about this idea of how we design some of these platforms–
  842. – Yeah.
  843. – Which we thought was really insightful.
  844. If you go back and look at some of the early
  845. history around some of the widespread
  846. social media apps that most of us are familiar with,
  847. in a lot of situations the founders, the initial
  848. creators of these platforms, actually didn’t have a deep
  849. understanding of what this would become, right?
  850. And so, on one hand, how do you model for,
  851. oh, we’re gonna have this kind of impact?
  852. Because at a certain point, when you’re just
  853. a handful of friends starting something,
  854. to think, we’re gonna be able to influence
  855. three billion people in the world, that’s pretty–
  856. – Yeah.
  857. – Arrogant and in some ways crazy, right?
  858. – Yeah. – One or the other.
  859. And so how do you model for that?
  860. That becomes a difficult question.
  861. But then once you start approaching it from,
  862. should you be doing something about that, right?
  863. And that again raises an interesting and insightful
  864. question about how you create
  865. the right structure for these platforms
  866. to police themselves.
  867. – Yeah and one of the things that we’re gonna talk
  868. about later on is access to algorithms.
  869. So as machine learning, AI and these other things
  870. become more prevalent, I mean we’re using these things
  871. every day, whether you realise it or not.
  872. But as they become more prevalent we have to understand
  873. these are effectively black boxes
  874. where we can’t see the algorithm.
  875. I watched, do not do this, it’s a waste of time.
  876. I watched a 23 minute video today from a prominent YouTuber
  877. about the YouTube algorithm
  878. and how difficult it is as a YouTuber to understand
  879. that algorithm and to create viral content.
  880. But what he was saying is that the reason why
  881. YouTube is getting more and more clickbait-y,
  882. as he put it, and relying on thumbnails
  883. more and more, which is why you’re seeing
  884. a lot of maybe irreverent pictures or things
  885. that really cause people to click on them,
  886. is because of the way that they created the algorithm,
  887. it really–
  888. – To reward that.
  889. – It really rewards the click-through rate.
  890. CTR.
  891. The click-through rate.
  892. And so he was saying that, as a creator,
  893. you can’t be as thoughtful about those things
  894. because it’s kinda driving people in that direction
  895. based on that algorithm.
  896. Now we can’t see that algorithm.
  897. Right? This is one of those things.
  898. And so the question going forward
  899. is should we be able to?
  900. Should this be a public good?
  901. These are some of the questions that we’re going
  902. to address in the future.
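
To make the click-through-rate point concrete, here is a tiny Python sketch; the video titles and numbers are invented, and real recommender systems weigh many more signals, but ranking purely by CTR already shows why clickbait floats to the top:

    # Toy illustration: a recommender that ranks purely by click-through
    # rate (CTR) surfaces the clickbait title first. All data invented.
    videos = [
        {"title": "Measured, nuanced explainer", "clicks": 40, "impressions": 1000},
        {"title": "YOU WON'T BELIEVE THIS!!!", "clicks": 120, "impressions": 1000},
    ]
    ranked = sorted(videos, key=lambda v: v["clicks"] / v["impressions"], reverse=True)
    print([v["title"] for v in ranked])  # the clickbait title ranks first
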
  903. And so I think as we go forward again,
  904. really really appreciate these very thoughtful
  905. comments because these are the broad questions,
  906. these are the new social goods.
  907. These are the new commodities.
  908. And so as we move forward,
  909. as we think about how these companies should be operating,
  910. we really need to think these questions through.
  911. Now one last point,
  912. sorry I know this is already running long.
  913. One last point is, when we talk about these billionaires
  914. that own and operate these massive technology platforms,
  915. one thing to consider which is totally new
  916. in this landscape, is that because of what is known as
  917. weighted voting rights, you now have people
  918. like Mark Zuckerberg who do not own a majority
  919. of the shares in their company anymore.
  920. So he does not own a majority of Facebook
  921. and yet he has almost complete control
  922. over what Facebook does, because every one
  923. of his shares has 10 voting rights
  924. versus if you own an ordinary Facebook share, you only get one vote.
  925. – So different class of shares.
  926. – Different class of shares.
  927. And so as a result of that you have guys
  928. like Jeff Bezos, or the people that own Alibaba
  929. that have an immense amount of power and control
  930. not only over your data and privacy
  931. which we’re gonna talk about going forward,
  932. but literally the news content that we read every day.
  933. Right?
  934. The types of products that we see and buy.
  935. The type of news, there’s so many things.
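
To see the arithmetic behind weighted voting rights, here is a quick back-of-the-envelope Python sketch; the numbers are made up for illustration and are not anyone's actual cap table:

    # Made-up numbers showing how 10x voting shares turn a minority
    # economic stake into majority voting control.
    founder_shares = 18   # founder owns 18% of the equity
    public_shares = 82    # everyone else owns 82%
    founder_votes = founder_shares * 10  # class B: 10 votes per share
    public_votes = public_shares * 1     # class A: 1 vote per share
    control = founder_votes / (founder_votes + public_votes)
    print(f"founder voting control: {control:.1%}")  # about 68.7%
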
  936. – And ironically I guess, the historical,
  937. at least the modern historical genesis
  938. of weighted voting
  939. or multi-class share structures
  940. was to allow media companies
  941. a little bit more editorial independence–
  942. – Yeah protection from shareholders.
  943. – Protection from being influenced, from,
  944. oh, I don’t like you sharing this kind of truth,
  945. so allowing them to insulate themselves from that,
  946. and now it’s maybe a little–
  947. – It’s gone full circle.
  948. – Kind of a little backwards.
  949. That’s interesting.
  950. I think the other thing that we’ve talked about in the past,
  951. which is incredibly relevant to what you’re talking
  952. about now, is the pervasiveness
  953. of some of these social media platforms and
  954. how they influence us.
  955. And we both use them a lot ourselves
  956. and so we’re not saying that by default
  957. they’re evil, but more that we should just be aware.
  958. I think, in my own world working with some technology
  959. companies and start-ups, it’s incredible
  960. and even if you talk to people who are on large
  961. social media platforms or look at how they describe
  962. the platform design, it’s intentionally created
  963. so people stay on it, right?
  964. – Yeah.
  965. – The user interface is very much a combination
  966. of human behavioural psychology as well as
  967. graphic design and other things to make sure
  968. people stay on it, and for a lot of people
  969. putting together these platforms now,
  970. part of the design question is: what can we do
  971. to somewhat manipulate people to stay on it?
  972. And this is part of the business model.
  973. – Yeah.
  974. – So this is important for us as consumers
  975. to be aware of that.
  976. – Yeah.
  977. Okay, if you get us talking we’ll talk forever.
  978. So we’re gonna end here, but we hope to see you
  979. in the blockchain modules so we can talk about
  980. how some of these new emerging technologies
  981. are gonna be impacting our lives, for good,
  982. maybe a little bit for the negative
  983. and how you as an innovator,
  984. as someone in the finance industry
  985. or just an interested party
  986. can utilise these technologies for your own life
  987. and for your own career.

Module 2 Blockchain and Its Governance

Module 2 Introduction

  1. Hi and welcome back to Module 2!
  2. Thanks for sticking with us.
  3. We promise it’s only going to get more interesting!
  4. In this module we are going to talk about blockchain, which is really one of the key
  5. catalysts for the rise of fintech.
  6. Now a few upfront caveats: The focus of this module is NOT which cryptocurrency you should
  7. invest in and if you have followed cryptocurrency markets, it has been particularly volatile,
  8. so as the cryptocurrency enthusiasts like to say, “Hodl”—hold on for dear life.
  9. Frankly, we don’t know which cryptocurrency you should invest your life savings in, so
  10. please don’t ask, and if we did know, honestly, we probably wouldn’t be doing this course,
  11. we’d be at a warm beach.
  12. So another caveat is that this module will also not discuss initial coin offerings (ICOs),
  13. IICOs, STOs, or any of the variants by which someone might try to fundraise or monetize
  14. for their blockchain project.
  15. Don’t get us wrong, these mechanisms are all interesting, but there is so much information
  16. to cover, it could easily be its own course, and because of changing laws and regulations
  17. in different jurisdictions, it’s difficult to explain in a snapshot format since the
  18. regulatory landscape is constantly changing.
  19. Maybe most importantly, though we are both lawyers, we’re not your lawyers, so if this
  20. is something you are thinking about doing as part of a blockchain project, please speak
  21. with your lawyer.
  22. Now given what we just said, the focus of our module is more about questions that might
  23. be good to consider as blockchain technologies become more pervasive.
  24. Really what are blockchain’s implications–both the wonderful disruptive possibilities that
  25. it represents as well as potential issues we should consider before completely embracing
  26. it.
  27. You’ll find we won’t focus too much on blockchain’s technical details in this module,
  28. basically for two reasons: 1) because blockchain and its applications
  29. continue to grow so rapidly, things will likely have advanced a bit between the time we prepared
  30. this module and the time you end up watching it; and more importantly,
  31. 2) the next course in the Fintech Certificate, which our Fintech Ethics course is also a
  32. part of, is entirely focused on blockchain and is taught by a wonderful colleague
  33. of ours from the Faculty of Engineering at the University of Hong Kong, who is a real
  34. technical expert in the space.
  35. So if you find yourself with an increased interest in blockchain, please be sure to
  36. register for the next course, “Blockchain and Fintech”.
  38. So one quick question for you as we keep using these terms interchangeably; what is the difference
  39. between blockchain and cryptocurrency?
  40. That’s a great question David, and I think a lot of people sometimes use those interchangeably.
  41. Effectively, cryptocurrency is one of the outputs of the blockchain.
  42. So, as people mine – and we’ll talk about some of this vocabulary in a little bit in
  43. our course – as computers that are part of a blockchain network mine and solve problems
  44. to basically build additional blocks onto the blockchain, coins are produced, and
  45. part of that is to incentivize these miners to do the activity.
  46. So keep that in mind as we go through.
  47. So, a blockchain, which is a distributed ledger network, can be used for all kinds of
  48. things, including tracking certain types of goods or services, even people, whereas
  49. cryptocurrencies in their various forms are specifically new forms of payment.

2.1.1 What Is Blockchain Technology?

  1. In its most basic form,
  2. a blockchain is a distributed ledger,
  3. essentially a series of digital records,
  4. referred to as blocks,
  5. which are connected together forming a chain of records,
  6. hence blockchain.
  7. Instead of this data being kept in a single place though,
  8. the information is replicated and distributed
  9. across a peer-to-peer network of computers.
  10. The network collaborates together to confirm
  11. if new blocks of data can be added to the chain,
  12. which makes it difficult for a single member
  13. of the network to add incorrect information.
  14. Additionally, such a decentralised nature
  15. also makes the blockchain difficult to modify
  16. thus preventing tampering.
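
For readers who like to see the idea in code, here is a minimal Python sketch (standard library only) of that chain-of-hashes structure; real blockchains add consensus rules, digital signatures and Merkle trees on top of this:

    # Each block stores the hash of the previous block, so editing any
    # block breaks every link that follows it.
    import hashlib
    import json

    def block_hash(block: dict) -> str:
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    chain = [{"index": 0, "data": "genesis", "prev_hash": "0" * 64}]

    def add_block(data: str) -> None:
        chain.append({
            "index": len(chain),
            "data": data,
            "prev_hash": block_hash(chain[-1]),  # link to the previous block
        })

    add_block("Alice pays Bob 5")
    add_block("Bob pays Carol 2")

    # Tampering with an early block invalidates every later link.
    chain[1]["data"] = "Alice pays Bob 500"
    valid = all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                for i in range(1, len(chain)))
    print("chain valid?", valid)  # False after the tampering above
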
  17. Though a number of folks had researched
  18. and thought about blockchain
  19. and many of the cryptographic technologies
  20. that underpin it before,
  21. the concept of the blockchain and its offshoot – cryptocurrencies,
  22. really entered into the public domain
  23. after a white paper was published in 2008
  24. by Satoshi Nakamoto, titled:
  25. Bitcoin: A Peer-to-Peer Electronic Cash System.
  26. Now, the paper described bringing together
  27. various technologies and cryptographic methods
  28. to form the Bitcoin protocol
  29. and has gone on to serve as a framework
  30. for many of the subsequent blockchain related
  31. advances in the FinTech space.
  32. So who is Satoshi Nakamoto?
  33. Though there has been a lot of speculation,
  34. Satoshi Nakamoto is a pseudonym,
  35. and the general public really does not know,
  36. at least not yet,
  37. the identity of this person,
  38. or if Mr. Nakamoto is a single person or
  39. perhaps even a group of people.
  40. And even if we never figure out
  41. who Satoshi Nakamoto is,
  42. there is a real possibility that
  43. history will look back on his 2008 white paper
  44. as a seminal moment that fundamentally
  45. changed the course of history,
  46. or at least financial history.

2.1.2 How Is Blockchain Governed?

  1. When considering FinTech governance,
  2. especially for blockchain technologies,
  3. is lack of regulation a pro or a con?
  4. Blockchains are effectively regulated like industry groups,
  5. or even members only clubs.
  6. And the mechanism for governance
  7. is generally based on the
  8. principle of majority rule.
  9. But is majority rule always right?
  10. Now this is like straight back to ancient Greece, right?
  11. But the reality is that most modern democracies
  12. are not actually direct democracies
  13. where the simple majority always wins and governs.
  14. So this is why we think that Bitcoin and blockchain
  15. are simultaneously so appealing,
  16. and yet so threatening.
  17. Because the one-person, one-vote idea
  18. is basically built into the code.
  19. And so whoever controls the majority,
  20. they also get to rewrite the rules.
  21. And your identity is typically anonymous,
  22. so it’s difficult to identify who the other actors are.
  23. And so these principles raise
  24. a whole host of interesting issues.
  25. Because as you think about particular
  26. blockchain protocols, be it
  27. Bitcoin, Ethereum or other
  28. widespread protocols that are gaining
  29. more and more different use cases,
  30. we could easily imagine a situation
  31. where a particular protocol or application
  32. becomes so widespread
  33. that it affects many other people.
  34. Do we want that to be governed by the members
  35. who hold the coins and can vote, or
  36. should that be regulated on a more national
  37. or even international level?
  38. What process would you trust more?
  39. Now we’re not advocating that blockchain
  40. should be governed at a more national
  41. or international level,
  42. or have greater regulatory scrutiny per se,
  43. but it just raises the question:
  44. as these technologies are becoming more pervasive,
  45. is the current governance structure
  46. the way we want to deal with that?
  47. Especially if it is going to impact so many
  48. other people who are not necessarily part of
  49. the “member system”.
  50. If you consider voting from a corporate governance perspective,
  51. the concept of majority voting, otherwise
  52. characterized as one share, one vote,
  53. has long been the general rule.
  54. But while things definitely started that way
  55. the reality is that a whole host of
  56. diverse voting mechanisms have been adopted
  57. to ensure proper governance.
  58. For example, supermajority voting has been
  59. legally built into many aspects of the corporate world.
  60. An example of this would be a
  61. special resolution to change the name or
  62. nature of a company which would require
  63. a supermajority of the shareholder votes.
  64. Beyond that, a basic democratic majority or
  65. supermajority voting rule
  66. is not always the most efficient way to decide something.
  67. Now we have things like cumulative voting
  68. or other different methods
  69. where like a minority shareholder or a voter
  70. could have a stronger influence or a voice
  71. on a particular matter.
  72. So if we apply this back to blockchain and cryptocurrencies
  73. at their genesis,
  74. we need to consider what the best way is,
  75. for us to manage them.
  76. Should there be a more comprehensive type of
  77. voting or control structure?
  78. Or do we really want a simple majority rule,
  79. and just give power to the people?
  80. These are the type of questions that are going to
  81. take some time to answer.
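
As a concrete illustration of how the choice of threshold changes outcomes, here is a minimal Python sketch with invented vote counts:

    # The same 55%-yes vote passes under simple majority but fails
    # under a two-thirds supermajority rule.
    def passes(yes_votes: int, total_votes: int, threshold: float) -> bool:
        return yes_votes / total_votes > threshold

    yes, total = 55, 100
    print(passes(yes, total, 0.5))    # True: simple majority
    print(passes(yes, total, 2 / 3))  # False: two-thirds supermajority
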
  82. We talked about governance and how some of these
  83. protocols are governed by users,
  84. and fundamentally we have to remember that
  85. blockchain seeks consensus first
  86. and not necessarily fairness or efficiency.
  87. And that could be right or wrong,
  88. it’s something we’ll have to consider in the future.
  89. But will blockchain and its uses
  90. create greater inequality in the long run?
  91. And if we jump ahead,
  92. will people that are already left behind be further left behind?
  93. One of the novel uses of blockchain
  94. is coupling it with something called a smart contract,
  95. which is not really smart
  96. and may not always actually even be a contract.
  97. So now that you’re probably confused,
  98. let’s talk about it.

Additional Readings

    • Orcutt, M. (2018). How Secure is Blockchain Really? MIT Technology Review. Retrieved from https://www.technologyreview.com/s/610836/how-secure-is-blockchain-really/
    • Yermack, D. (2017). Corporate Governance and Blockchains. Review of Finance. 21(1), 7–31, Retrieved from https://academic.oup.com/rof/article/21/1/7/2888422 

2.2.1 What Is a Smart Contract?

  1. The term “Smart Contract” sounds really
  2. exciting and futuristic, right?
  3. But hold your excitement,
  4. because the current forms of smart contracts
  5. are neither smart nor even contracts.
  6. Computer scientist Nick Szabo,
  7. an influential figure in the blockchain and cryptocurrency world,
  8. is credited with initially coining the phrase
  9. “smart contract” as early as the mid-1990s.
  10. A smart contract is simply a computer protocol,
  11. really some lines of code that automatically
  12. execute a specified action,
  13. like releasing a payment,
  14. when certain conditions are fulfilled.
  15. So this code might represent an aspect of a contract,
  16. but the code itself is not actually a contract.
  17. Additionally, it’s not smart because a person still needs
  18. to think of the terms that will be represented by the code.
  19. So someone like a lawyer is still needed to think through
  20. and negotiate the terms to be coded.
  21. So if these smart contracts are actually not smart nor contracts,
  22. why are they so special?
  23. To answer that question,
  24. imagine you are cleaning out your room
  25. and find a tennis racquet you never used
  26. and now want to sell.
  27. You go online and are able to find a buyer,
  28. say David, that lives nearby.
  29. You set up a meeting and show David the tennis racquet.
  30. David confirms his interest and then gives you the money
  31. and you hand over the tennis racquet.
  32. In this example, there is minimal risk that
  33. David will be able to run off with the racquet
  34. without paying you.
  35. But let’s imagine the same situation
  36. except you live far away from each other,
  37. so you aren’t able to meet,
  38. do you feel comfortable sending the racquet through the mail
  39. and trusting David to pay you?
  40. Now this type of risk is usually less of an issue
  41. when dealing with large companies,
  42. like when you order a t-shirt from your favorite brand’s online store,
  43. or people you may have repeat transactions with,
  44. but for one-off situations or large, complicated transactions,
  45. like a home purchase,
  46. there can be some uncertainty about
  47. payment, delivery, quality of product, etc.
  48. In such a situation then,
  49. what if you can find a third-party,
  50. say Jon, to take the payment from David
  51. before you send the racquet,
  52. and you’ll get the payment from David
  53. when the racquet is received?
  54. Would you feel more comfortable?
  55. This is exactly how smart contracts work:
  56. using “if something happens then…”
  57. or “when something happens then…”
  58. type of logic to solve this problem.
  59. So in our example,
  60. if a specified contractual term, say racquet delivery,
  61. had been fulfilled,
  62. then the protocol would execute release of payment,
  63. thus solving the problem.
  64. So how does this relate to blockchain?
  65. With blockchain technology,
  66. these smart contracts can be stored or embedded on a blockchain,
  67. so instead of being visible to only the counterparties
  68. that may have a copy of the contract like
  69. in a traditional contracting situation,
  70. a smart contract is available widely for inspection
  71. on the blockchain.
  72. In the example of selling your racquet,
  73. not only you, David and Jon know about the contract,
  74. it is also visible to the bank that processes David’s payment,
  75. and the delivery guy who delivers the package,
  76. and every other actor that’s involved in this transaction,
  77. or has access to the blockchain in general.
  78. The distributed nature of the blockchain
  79. makes it difficult for a bad actor to not pay, delay payment,
  80. manipulate terms, or otherwise deviate from the terms
  81. of the original agreement because
  82. the terms are recorded across the network
  83. and cannot be changed.
  84. And once they are fulfilled then payment is self-executing
  85. and happens automatically.
  86. Which means, when the blockchain tracks that the racquet is received,
  87. the money will be sent to your account automatically.
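
As a rough sketch of this "if delivery, then payment" logic, here is a short Python example. The class and method names are invented for illustration; a real smart contract would be deployed on-chain, for example in Solidity, and would execute without any single operator:

    # Toy escrow logic: payment releases itself once both conditions hold.
    class EscrowContract:
        def __init__(self, price: float):
            self.price = price
            self.funds_deposited = False
            self.delivery_confirmed = False

        def deposit(self, amount: float) -> None:
            if amount >= self.price:
                self.funds_deposited = True  # buyer's payment is locked up
                self._maybe_release()

        def confirm_delivery(self) -> None:
            self.delivery_confirmed = True   # e.g. the courier's delivery scan
            self._maybe_release()

        def _maybe_release(self) -> None:
            # The "if this happens, then..." part: no Jon, no escrow agent,
            # and no backing out once the conditions are fulfilled.
            if self.funds_deposited and self.delivery_confirmed:
                print("Payment released to the seller")

    contract = EscrowContract(price=50.0)
    contract.deposit(50.0)
    contract.confirm_delivery()  # -> Payment released to the seller
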
  88. So what are the benefits of a smart contract?
  89. Well some things that maybe come to mind are:
  90. One, these things don’t require human interpretation,
  91. hence taking out some human error.
  92. The reason is that they’re self-executing.
  93. So there’s no issues with
  94. a human doing something incorrectly,
  95. as part of processing a contract,
  96. or it removes some of the temptation that,
  97. maybe someone feels of like:
  98. well if I keep my end of the deal then
  99. I end up being worse off.
  100. So it removes this human temptation issue.
  101. Additionally, once a smart contract is coded in,
  102. generally it can’t be changed,
  103. so it’s immutable.
  104. Now because of those factors,
  105. this ultimately should save time and money,
  106. thus making things more efficient
  107. and reducing transactional friction.
  108. Additionally, if we tie this
  109. back into the tennis racquet example,
  110. it removes the need for a third party.
  111. You see for lots of transactions historically,
  112. a third party has been necessary
  113. to hold payment or collateral
  114. due to risk related to a lack of trust,
  115. which is something we’ve talked about.
  116. Now perhaps the most common form of
  117. this type of third party is something known
  118. as an escrow agent.
  119. Now imagine that instead of buying a tennis racquet
  120. a US company is trying to purchase a big building in
  121. another country, say, China.
  122. They do not know each other,
  123. and they cannot meet somewhere with a pile of cash
  124. to make the payment and sign the deed
  125. at the same time.
  126. So the two contracting parties may enter into this
  127. staring contest of “who’s going to pay first?”
  128. or “who’s going to act first?”.
  129. In this situation, an escrow agent would serve as the third party,
  130. or a middle party:
  131. on one hand holding the payment from the US company,
  132. and on the other hand holding the signed deed or legal agreement
  133. from the building owner.
  134. And once the two parties agree to pay and finalize the terms of the transaction,
  135. the agent will transfer the money and the deed simultaneously,
  136. so ensuring the building owner will get their money,
  137. and the building purchaser will receive the legal title and the relevant documents,
  138. so they can own the building.
  139. As you can see smart contracts would serve
  140. the purpose of cutting out the middle party
  141. be it Jon in the tennis racquet example,
  142. or the escrow agent in a large international real estate transaction.
  143. And as we previously discussed,
  144. a lot of time and money can be saved by
  145. cutting out the middlemen.
  146. But does that mean smart contracts are great solutions
  147. for all contracting relationships or situations?
  148. The answer to that is “no”,
  149. and we’ll discuss why that is in the next video.
  150. But before that, we’d like you to think about a question:
  151. what are the situations in which a smart contract
  152. may make your life easier?

Additional Readings

  • Paech, P. (2018). Law and Autonomous Systems Series: What is a Smart Contract? Oxford Law Faculty. Retrieved from https://www.law.ox.ac.uk/business-law-blog/blog/2018/07/law-and-autonomous-systems-series-what-smart-contract

2.2.2 Applications of Smart Contract

  1. So, when we discussed smart contracts,
  2. we mentioned that they may not
  3. be the solution for every legal problem.
  4. Definitely.
  5. So why is that?
  6. Because I think people think it’s “smart”,
  7. it should just evolve and it’ll be okay,
  8. but that’s probably not the case.
  9. So why is that?
  10. Well, so like you mentioned,
  11. smart contracts have been around,
  12. the concept has been around, since the 90s,
  13. and yet, the vast majority of people
  14. don’t know what it means
  15. or have never actually used one before,
  16. because in reality smart contracting is really hard.
  17. Basically smart contracts are typically a binary solution,
  18. “if this then this”,
  19. it is really much like computer programming.
  20. And it works in a legal situation where, if I tick off
  21. all these boxes, then you are automatically
  22. going to remit the funds or transfer the deed
  23. or whatever it is that is the outcome of
  24. that contract.
  25. But if it is not a situation where you can just
  26. tick off those boxes and have
  27. “if this then this” type solutions,
  28. and most legal situations are not like that,
  29. as we know,
  30. then the smart contract is very,
  31. very difficult to apply.
  32. I think maybe as AI becomes better and
  33. machine learning gets better,
  34. then maybe it will be able to get on to the periphery
  35. and deal with those grey areas a little bit better,
  36. but until then, smart contracts are going to be relegated
  37. to very simple, very rote types of,
  38. “if this then this” type contracts.
  39. Interesting, so,
  40. I think there’s two things that are
  41. really interesting about that.
  42. One is, the idea that a smart contract is kind of
  43. like an oxymoron,
  44. in that it actually is not that smart, to be frank.
  45. Like an honest lawyer.
  46. Just kidding.
  47. But you know, secondly,
  48. I think the point about that,
  49. the applications of smart contracts will probably
  50. be very applicable to the routine and mundane.
  51. Potentially.
  52. Well,
  53. not to say not important, but just to say:
  54. So if you and I are buying, let’s say I’m buying
  55. a building from you, and you are in Seoul,
  56. and I am here in Hong Kong
  57. – there’s a lot of variables in that.
  58. Right, so, I have to do my due diligence,
  59. to look at past history, understand potential legislation,
  60. I have to look at the foundation,
  61. I have to look at utilities, I have to look at mortgages.
  62. All these other things.
  63. So, typically when you enter into a contract
  64. that’s complex like that,
  65. it will have conditions precedent and all these things.
  66. So really quick Dave,
  67. so we understand what that means,
  68. but what is a condition precedent?
  69. It means like it is a condition that precedes the closing.
  70. So if we enter into a contract, we sign it,
  71. but I’m not going to give you the money yet
  72. – and you are not giving me the deed yet.
  73. Instead, we have to go down a list and confirm
  74. every single thing has been done.
  75. Right, so, I’ll usually get a few months.
  76. I’ll look, okay, is the foundation solid?
  77. Yes.
  78. Do my engineers like it?
  79. Yes.
  80. Research litigation, is there any litigation history?
  81. No.
  82. Right, so then after I tick off all those boxes,
  83. I agree to finally give you the funds
  84. and you transfer the deed to me.
  85. It’s simple, but it is obviously very complicated
  86. – because life is complicated.
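
In code, that box-ticking might look something like the following Python sketch; the conditions are invented for illustration, and the point is simply that closing only executes once every condition precedent is satisfied:

    # Closing is gated on a checklist of conditions precedent.
    conditions_precedent = {
        "foundation_inspection_passed": True,
        "engineers_approved": True,
        "no_litigation_history": True,
        "financing_in_place": False,  # still outstanding
    }

    if all(conditions_precedent.values()):
        print("Close: transfer the funds and the deed")
    else:
        outstanding = [name for name, done in conditions_precedent.items() if not done]
        print("Cannot close yet; outstanding:", outstanding)
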
  87. I think it’s interesting that you talk about the
  88. idea of the complexity of life.
  89. Because I think what we are talking about really is:
  90. any time there is some major qualitative assessment
  91. that’s necessary, then it’s going to be very difficult for
  92. a smart contract to really be applied to that.
  93. It’s where those variables are really minimal
  94. or not existent – or it’s very vanilla – about
  95. “Okay, this is what needs to be done,
  96. this is what you need to do”,
  97. and those responsibilities are very clearly defined
  98. that we can rely on smart contracts.
  99. And here’s the interesting thing
  100. that a lot of people don’t think about
  101. when they think of contracting.
  102. You legally have the right to breach.
  103. Right so when you enter a contract,
  104. and there’s ethics issues in there, and obviously you
  105. want people to fulfill the agreement
  106. – but you always have the right to back away.
  107. Now, there are legal ramifications for that.
  108. If you stop paying your mortgage,
  109. they can take your house.
  110. You could pay a fine.
  111. Yeah exactly, pay a fine, whatever,
  112. but the point is,
  113. if there is some underlying condition where I need to
  114. stop paying my mortgage,
  115. I have the right to do that.
  116. Within a smart contract,
  117. you don’t have that option generally speaking because
  118. it is again –
  119. – upon the conditions being fulfilled, it is self-executing –
  120. it executes automatically.
  121. Right so when they say smart,
  122. what they mean is that it does not require
  123. human intervention to execute and
  124. fulfill the terms of that agreement.
  125. But it’s like a roller coaster.
  126. Once you are going down the hill,
  127. there’s no pulling back,
  128. you’re kinda stuck with that ride.
  129. So, there’s a level of commitment that’s required
  130. if you go down this route.
  131. Which is why I don’t think you are going to see
  132. any time soon any type of complex transaction
  133. where people are using smart contracts.
  134. Everybody wants to be able to get to the end of the
  135. line in that roller coaster analogy
  136. – they want at the very-last moment,
  137. they wanna say: you know what,
  138. I don’t want to get on this ride.
  139. Even if that means they have to pay a fine.
  140. Even if they have to pay a breach fee or something.
  141. I need to get off this ride.
  142. And I think for a lot of companies
  143. and a lot of transactions, they need that.
  144. Yeah, I think you’re right.
  145. And I think for complex type of transactions,
  146. you’re right.
  147. I don’t think the use of smart contracts
  148. will necessarily proliferate, in the near-term at least.
  149. But, I do think there’s a wide variety of
  150. daily contracting that we just normally do,
  151. that could really be applicable perhaps to this.
  152. I mean, probably the most complex version
  153. of that would be a home purchase, to be honest.
  154. If you got the right documentation done up-front
  155. then you could potentially find a very efficient
  156. smart contract to deal with escrow
  157. and things like this.
  158. But I think it’s an interesting thing that
  159. a lot of people, both lawyers and technologists,
  160. are continuing to explore, and a really important part of
  161. this FinTech ecosystem that people are trying to create.
  162. Yeah, IoT, right? The Internet of Things.
  163. With wearables and things.
  164. I can see, for example, say a health insurance policy,
  165. where there’s a smart contract tied to that,
  166. where if you exercise a certain number of days,
  167. or if you use certain things,
  168. then your premium comes down.
  169. Yeah exactly. If you drive…
  170. So they are already doing this with cars,
  171. right, they’ll put a device to measure your speed
  172. and everything on your car as long as you’re
  173. a safe driver,
  174. then your insurance premium comes down.
  175. A lot of those aren’t officially smart contracts yet,
  176. but you could see the method. Totally.
  177. You get the big data analytics,
  178. you get the AI and machine learning on the backend of
  179. that.
  180. It makes it very easy for that to be executable.
  181. Like a thousand little contracts. Basically.
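
Here is a small Python sketch of that idea; the threshold and the discount are invented, but it shows how a parametric rule could adjust a premium automatically from wearable data:

    # A parametric insurance rule: the discount applies itself when the
    # sensor-reported condition is met, with no claims handler involved.
    def adjusted_premium(base_premium: float, active_days_this_month: int) -> float:
        if active_days_this_month >= 20:
            return base_premium * 0.90  # 10% discount, applied automatically
        return base_premium

    print(adjusted_premium(100.0, active_days_this_month=23))  # 90.0
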
  182. So the lesson I take from that, before we move on is,
  183. if and when that happens,
  184. and wearables are reporting to my
  185. health insurance provider, and that will impact
  186. my health insurance premium,
  187. then I will purchase a dog, and we’re gonna
  188. put a wearable on the dog
  189. and let the dog run around.
  190. People are actually doing that already!
  191. So there you go.
  192. Just kidding.
  193. That was a joke!
  194. This is the ethics side of it. That was a joke.
  195. We also have humour in our modules as well.
  196. Thanks.

Additional Readings

  • Levi, S. D., & Lipton, A. B. (2018). An Introduction to Smart Contracts and Their Potential and Inherent Limitations. Harvard Law School Forum on Corporate Governance and Financial Regulation. Retrieved from https://corpgov.law.harvard.edu/2018/05/26/an-introduction-to-smart-contracts-and-their-potential-and-inherent-limitations/

2.2.3 Implications of Blockchain Technology

  1. Well, blockchain sounds awesome, right?
  2. So what’s the problem?
  3. Well really no problem per se,
  4. but let’s consider some questions:
  5. Okay so first, from a business perspective
  6. blockchain is just another type of technology, but it’s not
  7. a panacea for all business problems.
  8. So it’s important that you have the type of
  9. business problem that lends itself
  10. to a blockchain solution.
  11. Now moving beyond that though there are other
  12. implications to consider.
  13. Blockchain has an impact on the environment,
  14. for example.
  15. Remember when we mentioned that blockchain
  16. is a distributed network?
  17. Each node on the network is a computer that
  18. requires electricity.
  19. Each of those computers is engaged in “mining”
  20. —effectively solving complicated mathematical
  21. problems to add blocks to the chain.
  22. These mining rigs require lots of electricity
  23. to both run the computers and to provide the
  24. cooling that prevents the computers from overheating.
  25. So I have students that have a spare laptop or
  26. computer in their dorm room,
  27. and they have downloaded mining software
  28. and use electricity in their dorm 24 hours a day to mine,
  29. albeit very inefficiently,
  30. and they think the electricity is free,
  31. but of course that comes at a cost.
  32. So, for someone who’s a layman,
  33. someone who’s not a technologist,
  34. you keep using the term mining, what does that mean?
  35. Basically, computers have to calculate a series
  36. of very complicated mathematical problems
  37. in order for them to be approved to add an
  38. additional block or information to this “blockchain”,
  39. and this is a level of security and access,
  40. a barrier to access,
  41. to prevent people from just adding things
  42. ad hoc onto the blockchain.
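
For the curious, here is a toy Python version of the kind of puzzle miners solve in proof-of-work systems like Bitcoin: find a nonce so the block's SHA-256 hash starts with a set number of zeros. Each extra required zero multiplies the expected work by 16, which is a simplified picture of why real mining consumes so much electricity:

    # Toy proof-of-work: search for a nonce that makes the hash start
    # with `difficulty` zeros. Bitcoin's real difficulty is vastly higher.
    import hashlib

    def mine(block_data: str, difficulty: int = 4) -> int:
        target = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
            if digest.startswith(target):
                return nonce  # evidence that computational work was done
            nonce += 1

    print("found nonce:", mine("block #1: Alice pays Bob 5"))
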
  43. Now, the ramification of this, however, is
  44. that it takes an incredible amount of computing power
  45. and will continue to take more
  46. and more computing power.
  47. And this is not just for Bitcoin,
  48. which is maybe the oldest or most well-known of the
  49. different types of cryptocurrency
  50. and blockchains out there.
  51. But for all the other different types of
  52. blockchain projects that have sprouted up,
  53. each of them requires some level of what they call
  54. mining.
  55. So now we have these huge mining rigs or farms
  56. out in random places in the world that
  57. – all they do is have these banks of computers
  58. that are basically calculating these series
  59. of mathematical problems in order to add more
  60. and more blocks to whatever blockchain they
  61. are working on.
  62. Okay so back to the environment,
  63. and the implications for that,
  64. you may be surprised to learn that,
  65. according to a 2017 estimate, mining Bitcoin consumed
  66. more electricity than each of 159 individual countries.
  67. So they say 30 terawatt-hours – whatever that means.
  68. It means a lot of electricity.
  69. Right.
  70. And that’s only for Bitcoin,
  71. you can see that the electricity consumption would be
  72. much, much higher if it also included
  73. the mining of other types of blockchains.
  74. I realize this is a FinTech ethics course,
  75. so why should we be talking about the environment?
  76. So, from my standpoint, this is a really interesting
  77. and super, super important point that many of you
  78. maybe don’t really think about.
  79. When we talk about the implications of these
  80. technologies, in this course,
  81. or if you’re just reading about them,
  82. we are typically talking about the person-to-person
  83. kinda transactional cost that they maybe have.
  84. So, loss of privacy or access to finance,
  85. and those are super, super important.
  86. But what we also have to think about,
  87. and what we hope that you think about,
  88. is the broader social and physical – even geographical –
  89. implications of these things.
  90. When we include technologies like this,
  91. when we introduce these technologies, again,
  92. getting back to this concept of cultural lag,
  93. the technology has far outpaced our understanding
  94. of how to really deal with that technology
  95. in our real lives – in terms of its
  96. implications for the natural environment.
  97. So there are good and bad examples of this.
  98. So, in some places, in Canada for example
  99. I’ve read they are taking old abandoned sawmills
  100. and lumber facilities that have been shut down,
  101. or whatever, and they are retrofitting
  102. those large facilities into mining farms, which
  103. – for some people is good – means more jobs,
  104. maybe brings income in there.
  105. But there’s a lot of negative ramifications as well.
  106. So again, noise pollution is very serious,
  107. so a lot of people in those communities are complaining
  108. about the noise.
  109. There’s obviously the electricity consumption.
  110. So the vast majority of mining farms are in China,
  111. and the vast majority of electricity
  112. from China comes from burning coal.
  113. And so there are very serious ramifications
  114. both now and in the future for things like that.
  115. But on the flipside,
  116. as culture kinda catches up to the technology,
  117. we are also starting to think more creatively about
  118. where to put these large institutions.
  119. So for example, again, in Canada,
  120. I’ve heard, I haven’t really seen this in use but I’ve heard,
  121. that they are actually trying to use the heat that is
  122. generated from these mining computers to heat
  123. industrial complexes or maybe even other types of
  124. buildings and homes.
  125. Okay, so there’s some mixed uses or mixed purposes
  126. of having these locations based on these farms.
  127. Yeah, from a technology or FinTech ethics standpoint,
  128. I’m curious on your perspective,
  129. should a technologist or inventor or whatever, a bank,
  130. should they have to, or should they even want to,
  131. think about the environmental implications
  132. or should they just be focused on the technology
  133. and the business model that they have?
  134. So that’s a great question and I think,
  135. a really fundamental question that we shouldn’t just be
  136. considering in our course,
  137. but in a lot of different domains to be frank.
  138. I think,
  139. I think science is telling us that we are at the precipice
  140. of some really fundamental changes that are happening
  141. to the world, well, have been happening to the world,
  142. and I think if we try to silo ourselves off and say
  143. – what I’m doing is not directly related to that –
  144. I think we can collectively find ourselves in a place
  145. that we didn’t intend to be.
  146. So, I think, irrespective of industry, I think the impact that
  147. industry is having on the environment
  148. is important to consider.
  149. Yeah, there are things that we are gonna talk about
  150. in later modules.
  151. You’ll hear us talk about
  152. some positive uses of this technology.
  153. So again, it’s not just about currency.
  154. Blockchain can be used to track diamonds,
  155. to make sure that they are not conflict-diamonds
  156. or blood diamonds as they are sometimes called.
  157. To track people – either people that don’t have
  158. a government ID, like refugees, or
  159. migrant workers
  160. who potentially are at risk of human trafficking
  161. and slavery.
  162. So, there are a lot of very positive uses
  163. of this technology, and even uses that
  164. have nothing to do with currency whatsoever.
  165. But I think one of the things that we – you know,
  166. as David was just mentioning is
  167. we want you to constantly think about:
  168. what is the balance between introducing
  169. these new technologies and the positive ramifications
  170. for change, the disruption as Silicon Valley would say,
  171. to these markets, to the financial industry.
  172. But, what are also the unintended and possibly
  173. negative consequences that
  174. can come from these things?
  175. Not only now, but in the future,
  176. because if we are not thinking about those things,
  177. then by the time we get to that point
  178. and we see them right in front of us it might be too late.

Additional Readings

  • Hernandez, C. (2019). Bitcoin Has Saved My Family. The New York Times. Retrieved from https://www.nytimes.com/2019/02/23/opinion/sunday/venezuela-bitcoin-inflation-cryptocurrencies.html
  • Popper, N. (2018). A Field Guide to the Hurdles Facing Blockchain Adoption. The New York Times. Retrieved from https://www.nytimes.com/2018/06/27/business/dealbook/-blockchain-adoption-faces-hurdles.html

2.3.1 Applications of Blockchain Technology

  1. So, now that we have a better understanding
  2. of what blockchain is and some general idea
  3. of its possible uses, it’s probably becoming
  4. clearer why people are both excited and concerned
  5. about the technology.
  6. From a trust and accountability standpoint,
  7. the anonymous nature of blockchain means that
  8. user data and privacy are better protected,
  9. at least within the system.
  10. But in an ironic twist, blockchain-based markets
  11. are also where stolen customer personal data
  12. is bought and sold, because law enforcement
  13. often have trouble identifying the parties involved.
  14. And in a more commercial context,
  15. some are concerned that the unaccountable structure
  16. of blockchain-based products, like ICOs for example,
  17. leaves investors, and even in some cases
  18. the public at large, vulnerable.
  19. I think we are only just beginning to understand
  20. the incredibly beneficial aspects of blockchain technology.
  21. But from a cultural lag perspective,
  22. we also realize that we probably don’t yet understand
  23. the full extent of the challenges that will arise from its use.
  24. So let’s look further at a few examples of how
  25. blockchain can be used for both good and bad.
  26. First we will discuss a really exciting case about
  27. the dark web marketplace Silk Road,
  28. which used blockchain, and in particular, Bitcoin
  29. to create one of the largest marketplaces
  30. for illegal goods the world has ever seen.
  31. This Silk Road marketplace was like eBay or Amazon,
  32. but for illegal drugs and weapons.
  33. How could such a marketplace exist, you might be asking?
  34. Well it was hidden on the Dark Web.
  35. So before we get into the case, let’s first take a moment
  36. to discuss what the Dark Web is.

2.3.2 Dark Web and Tor

  1. Think of the internet as an iceberg in the ocean.
  2. The part that is visible to you and me
  3. is the “surface web”,
  4. which consists of the indexed pages on the internet,
  5. such as Google, or things you might find on Amazon and Facebook.
  6. Then there’s the deep web.
  7. The deep web is a subset of the Internet
  8. consisting of pages that can’t be indexed by
  9. search engines like Google or Bing.
  10. Pages that require membership fall under this category,
  11. so like online banking, your company intranet,
  12. and the very page that you are watching this web lecture on.
  13. Then, there’s the dark web, also called the “dark net”.
  14. This is a further subset of the “deep web”.
  15. None of the content can be accessed via a normal Internet browser,
  16. instead, you need a special cryptographic software,
  17. such as The Onion Router, also known as Tor.
  18. Tor is free software, initially created by
  19. the US Department of Defense and the US Navy
  20. in the 1990s for the purpose of secure communications.
  21. The name itself is an analogy to an onion
  22. with its many layers – as it offers anonymous access
  23. to online resources by passing user requests through
  24. multiple layers of encrypted connections.
  25. Therefore, you can think of the software essentially as
  26. a digital invisibility cloak,
  27. hiding users and the sites that they visit.
  28. And it is this anonymity of the dark web,
  29. coupled with blockchain’s relatively anonymous
  30. and decentralized nature, that laid the foundation
  31. for the infamous marketplace Silk Road, which
  32. we’ll introduce next.
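
To make the layering idea concrete, here is a minimal Python sketch of onion-style encryption. This is not the actual Tor protocol – real Tor negotiates per-circuit keys with a handshake, runs over TLS, and pads traffic into fixed-size cells – but it shows how a message wrapped in multiple encryption layers can pass through relays where each one can peel only its own layer. The three-relay setup and the Fernet keys are illustrative assumptions.

from cryptography.fernet import Fernet

# Pretend each relay in a three-hop circuit holds its own symmetric key.
# (Real Tor derives per-circuit keys via a Diffie-Hellman handshake.)
relay_keys = [Fernet.generate_key() for _ in range(3)]
relays = [Fernet(k) for k in relay_keys]  # entry, middle, exit

def wrap(message: bytes) -> bytes:
    # The sender encrypts for the exit relay first, then wraps each
    # earlier hop's layer around it, so the entry layer is outermost.
    for relay in reversed(relays):
        message = relay.encrypt(message)
    return message

def route(onion: bytes) -> bytes:
    # Each relay peels exactly one layer as the message travels.
    for relay in relays:
        onion = relay.decrypt(onion)
    return onion

onion = wrap(b"GET /hidden-service HTTP/1.1")
assert route(onion) == b"GET /hidden-service HTTP/1.1"

Because each relay can remove only its own layer, the entry relay learns who is sending but not what or to whom, and only the exit relay recovers the request – which is the “invisibility cloak” effect described above.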

Additional Readings

  • Greenberg, A. (2014). Hacker Lexicon: What is the Dark Web? Wired Magazine. Retrieved from https://www.wired.com/2014/11/hacker-lexicon-whats-dark-web/ 
  • Hackett, R. (2018). From Crimefighter to ‘Crypto’: Meet the Woman in Charge of Venture Capital’s Biggest Gamble. Fortune. Retrieved from http://fortune.com/longform/crypto-vc-andreessen-horowitz-kathryn-haun/

2.3.3 Case Study – Silk Road

  1. In February 2011, Ross Ulbricht,
  2. under the pseudonym Dread Pirate Roberts,
  3. created the website platform Silk Road,
  4. where people could buy anything anonymously
  5. and have it shipped to their home
  6. without any trails linking back to the transaction.
  7. Named after the historical trade route that
  8. connected Europe to East Asia, Ulbricht founded
  9. Silk Road with the desire to create a marketplace
  10. free from taxation and government.
  11. The clandestine online marketplace was largely
  12. made possible by the combination of widespread
  13. adoption of bitcoin and the invisibility of the Dark Web.
  14. Combining the anonymous interface of Tor
  15. with the traceless payments of digital currency bitcoin,
  16. the site allowed drug dealers and customers
  17. to find each other in the familiar
  18. realm of ecommerce.
  19. It functioned like an anonymous Amazon for
  20. criminal goods and services.
  21. Silk Road gradually developed to look similar
  22. to traditional web marketplaces
  23. with user profiles, reviews and more.
  24. And what started out focusing on drugs,
  25. soon included other products, such as firearms.
  26. And, although the authorities were aware of
  27. the existence of Silk Road
  28. within a few months of its launch,
  29. it would prove challenging to
  30. take down the website and reveal the true identity of its founder,
  31. Dread Pirate Roberts.
  32. In June 2013 the site reached nearly
  33. 1 million registered accounts.
  34. Thousands of listings featured all kinds of drugs,
  35. prescription medication, weapons and more,
  36. turning its founder, the 28-year old libertarian,
  37. into one of the world’s biggest drug kingpins.
  38. From its launch on February 6, 2011 until
  39. July 23, 2013, over 1 million transactions had
  40. been completed on the site,
  41. totalling revenue of almost 10 million Bitcoins
  42. and about 600,000 Bitcoins in commission.
  43. That involved roughly 150,000 buyers and 4,000 vendors.
  44. At Bitcoin exchange rates in September 2013,
  45. that was the equivalent of 1.2 billion USD in
  46. revenue and 80 million USD in commission (we sanity-check these figures after this lecture).
  47. In early 2013 a New York-based FBI team,
  48. Cyber Squad 2, had started their
  49. investigation of Silk Road.
  50. They were trying to crack the encrypted
  51. Tor network that Ulbricht was hiding behind.
  52. And like other law enforcement agencies,
  53. they were having a hard time.
  54. They even used undercover agents to
  55. try to get access to Ulbricht,
  56. but they were all struggling to break the case open.
  57. Finally, through a warning note on Reddit,
  58. the cyber squad was able to find a code which
  59. was leaking an IP address,
  60. pointing to a facility in Reykjavik, Iceland.
  61. This further enabled them to create
  62. a replica of the entire Silk Road system,
  63. allowing them to see everything and
  64. Dread Pirate Roberts’ every move.
  65. They read through his chat logs,
  66. followed the main bitcoin server showing all vendor transactions,
  67. and even learned how he had ordered
  68. several assassinations on people who
  69. had tried to blackmail him.
  70. Eventually, an IRS investigator was able to connect
  71. Dread Pirate Roberts to Ulbricht,
  72. through an old post on an open forum
  73. where Ulbricht had asked a question
  74. about the encryption tool, Tor.
  75. Through that question Ulbricht’s personal email was revealed,
  76. which showed his full name.
  77. So, what happened next was straight out of a movie.
  78. While Ulbricht was in a public library in San Francisco,
  79. agents from the US government
  80. distracted him by staging a fight.
  81. And when he turned away looking at them,
  82. other agents grabbed his laptop
  83. and were able to secure the information –
  84. connecting Dread Pirate Roberts to his account.
  85. On the computer they secured a mountain of evidence.
  86. A list of all the Silk Road servers,
  87. 144,000 bitcoins,
  88. which at the time was worth more than US$20 million,
  89. a spreadsheet showing Silk Road accounting,
  90. and diaries that detailed all of Ulbricht’s
  91. hopes, fears and aspirations.
  92. As a result of all this,
  93. Silk Road was shut down and Ulbricht,
  94. the pioneer who opened the door
  95. for drug sales to flourish in cyberspace,
  96. was subsequently sentenced to two life terms in prison.
  97. In court, the judge emphasized that
  98. what Ulbricht did was unprecedented
  99. and in breaking that ground as the pioneer,
  100. he had to pay the consequences.
  101. Anyone who might consider doing something similar,
  102. needed to understand clearly that
  103. there would be serious consequences.
  104. And since then, similar marketplaces have been launched
  105. all over the dark web.
  106. Some have outright just stolen their users’ bitcoins,
  107. others have been successfully shut down by law enforcement,
  108. but still some others operate in some corner of the dark web
  109. although none to the sheer magnitude of the Silk Road.
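
Since the lecture quotes both the bitcoin totals and their September 2013 USD equivalents, we can back out the implied exchange rate and effective commission ourselves. The snippet below is our own back-of-the-envelope arithmetic; only the input figures come from the case (and the criminal complaint in the Additional Readings).

# Figures as quoted in the lecture:
revenue_btc = 9_500_000        # "almost 10 million Bitcoins" in sales
commission_btc = 600_000       # "about 600,000 Bitcoins" in commission
revenue_usd = 1_200_000_000    # "1.2 billion USD" at Sept 2013 rates
commission_usd = 80_000_000    # "80 million USD"

# Derived figures (our own arithmetic):
print(f"Implied BTC price via revenue: ${revenue_usd / revenue_btc:,.0f}")          # ~$126
print(f"Implied BTC price via commission: ${commission_usd / commission_btc:,.0f}") # ~$133
print(f"Effective commission rate: {commission_btc / revenue_btc:.1%}")             # ~6.3%

Both implied prices fall within Bitcoin’s actual September 2013 trading range, so the quoted totals are internally consistent.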

Additional Readings

  • Weiser, B. (2015). Ross Ulbricht, Creator of Silk Road Website, Is Sentenced to Life in Prison. The New York Times. Retrieved from https://www.nytimes.com/2015/05/30/nyregion/ross-ulbricht-creator-of-silk-road-website-is-sentenced-to-life-in-prison.html 
  • United States of America v. Ross William Ulbricht (2014). https://web.archive.org/web/20140220003018/https://www.cs.columbia.edu/~smb/UlbrichtCriminalComplaint.pdf 
  • The Untold Story of Silk Road, Part 1. Wired Magazine. Retrieved from https://www.wired.com/2015/04/silk-road-1
  • The Untold Story of Silk Road, Part 2. Wired Magazine. Retrieved from https://www.wired.com/2015/04/silk-road-2/
  • Roberts, J. J. (2018). Cryptocurrency Scams Are Now Among the SEC’s Top Enforcement Priorities. Fortune.  Retrieved from http://fortune.com/2018/11/02/sec-ico-report-cryptocurrency-scams/

2.3.4 Case Study – Silk Road: Subjectivity of Ethics

  1. Now I love this case, because it’s like
  2. straight up out of a movie, right?
  3. This 28-year-old guy, who’s seemingly very, very normal.
  4. Yeah.
  5. Neighbours didn’t have any idea what was going on,
  6. was leading, in many ways, what was considered
  7. one of the largest marketplaces for illegal behaviour
  8. that the world had ever seen before,
  9. where, you know, the total commerce
  10. was worth billions of dollars.
  11. So what do we learn from this?
  12. Well this is, that’s an interesting question.
  13. I think, there are, people get arrested all the time,
  14. right, for doing illegal behaviour,
  15. selling drugs, you know, all these kinds
  16. of very similar things that Mr. Ulbricht
  17. was charged with and convicted of,
  18. but what’s so special about this case,
  19. as it relates to ethics in general,
  20. and FinTech ethics in particular?
  21. Hmm, well a couple of things that immediately come
  22. to mind are the fact that, because the nature
  23. of these types of crimes, related to FinTech,
  24. has become increasingly cyber,
  25. the way that law enforcement now has to police
  26. these crimes is also becoming increasingly cyber,
  27. so a lot of the tools that Mr. Ulbricht
  28. and other people within that marketplace
  29. utilised in that dark web,
  30. law enforcement actually did use those same tools,
  31. right?
  32. So when they go undercover, for example,
  33. they’re not literally going undercover
  34. where they’re changing their identity
  35. or the way that they look,
  36. but they’re creating user names
  37. and other things, profiles,
  38. so that they can kind of infiltrate those market spaces,
  39. which again, could be remotely
  40. from somewhere in Wisconsin, for example.
  41. Sure.
  42. And talking to them in San Francisco
  43. or wherever he was.
  44. And you know, personally,
  45. in some of the law enforcement work that I’ve done,
  46. it’s the same thing, right?
  47. So a lot of the investigative work
  48. that you’ll do is now sitting in front
  49. of a computer trying to put together
  50. financial documents and transactions
  51. and things to kind of identify
  52. where the various actors are.
  53. The second thing that immediately stands out is,
  54. from an ethics standpoint,
  55. I find this really, really fascinating,
  56. because here, his stated mission
  57. was actually moral in nature, right?
  58. So Mr. Ulbricht was essentially trying
  59. to create a marketplace, so he’s libertarian, right?
  60. And believes that government
  61. and regulation are inherently evil in some ways.
  62. That is what he claims, and so he wanted
  63. to create a marketplace that was free of these types
  64. of restrictions. – Government interventions.
  65. Exactly, right?
  66. And the way that he described it was
  67. that the government should not have a monopoly
  68. on violence, for example, especially in terms of
  69. drug trafficking and whatnot.
  70. He believed, or at least he claimed to believe,
  71. that this type of online marketplace would
  72. actually be inherently more ethical and more moral
  73. than the violence that occurs every day
  74. with drug trafficking into, say, the United States, right?
  75. And I think this goes back to the subjectivity
  76. of ethics and why it’s so difficult to have
  77. kind of a global or even consistent dialogue
  78. concerning what is actually ethical,
  79. and I think it’s gonna become
  80. increasingly hard in terms of the transnational
  81. and global nature of these types of,
  82. not only crimes, but just commerce.
  83. Yeah, so those are really interesting questions for me.
  84. I think kind of piggybacking a little bit
  85. on some of things that you’ve said,
  86. I thought this was really fascinating
  87. because there have been large-scale drug,
  88. you know, sellers in the past,
  89. so that actual aspect of the crime itself
  90. is not necessarily unique historically,
  91. but the fact that he was able to rely on cryptocurrencies,
  92. particularly Bitcoin, to facilitate
  93. the transactions. – Yeah.
  94. – In the completely digital space,
  95. which created the safety that you’re talking about,
  96. and raises questions of anonymity,
  97. and privacy, and the use cases
  98. of certain aspects of this technology,
  99. which I think is also a component, as well,
  100. and is worth considering in the context of our course.
  101. Yeah, it’s only a matter of time,
  102. I guess, before the next iteration of Narcos,
  103. or whatever it is gonna be – the Bitcoin
  104. or Silk Road version of this – where they’re gonna have
  105. to explain how this entire kind of network
  106. of illegal behaviour has kind of gone crypto.
  107. Hmm.

Additional Readings

  • Ethics Guide: Subjectivism. (2014). BBC. Retrieved from http://www.bbc.co.uk/ethics/introduction/subjectivism.shtml 
  • Simonite, T. (2016). The Surprising Light Side of the Dark Web. MIT Technology Review. Retrieved from https://www.technologyreview.com/s/601073/the-surprising-light-side-of-the-dark-web/

2.3.5 Case Study – Silk Road: Cultural Lag

  1. – Okay, so getting back to the conversation
  2. about Silk Road, are there certain aspects of how
  3. these technologies are being utilised
  4. that kind of bode, or can help us understand
  5. what they’re gonna look like going forward?
  6. And, let me just relate it back to one
  7. of the key principles we’ve been talking about,
  8. which is the idea that once these technologies are out
  9. it’s very, very difficult to pull them back.
  10. And, there’s often this slippery slope
  11. kind of race to the bottom perspective.
  12. So, if you look at it from a kind of a corporate
  13. regulation standpoint, companies were created,
  14. and then later on, because people were seeking privacy,
  15. they would go out to these island nations,
  16. the Cayman Islands, the BBI, and then they
  17. would kind of outbid each other by trying to be
  18. more private and providing less information.
  19. And so, again, while that attracted
  20. a lot of legitimate business that also kind of
  21. increased the opportunity–
  22. – To abuse the system.
  23. – Yeah, legal forms of abuse that have created
  24. problems globally now.
  25. So, are we seeing this, are there other examples
  26. of this that kind of predict what this
  27. is gonna look like in the next iteration?
  28. – So, I think we’ve already seen at least one iteration,
  29. post Silk Road, and one of the things we mentioned
  30. is that one of the kind of drivers of allowing
  31. Silk Road to kind of grow to the size that it was
  32. was the use of Bitcoin as the medium of transaction.
  33. And, we believe Bitcoin and these kinds of cryptocurrencies
  34. provide some level of anonymity,
  35. though the blockchain itself, the ledger itself, is exposed.
  36. And, people can see the transactions that are happening,
  37. the actual users themselves have some level of anonymity
  38. as opposed to you using your credit card
  39. and being able to immediately identify who you are.
  40. And, maybe the next well-known version of this
  41. is Monero, which is another cryptocurrency,
  42. an alt coin, an alternative cryptocurrency
  43. that has developed, has grown quite rapidly
  44. the last few years.
  45. And, one of its key characteristics is that
  46. it’s even more anonymous than other cryptocurrencies.
  47. – That slippery slope.
  48. – Again, there’s potential slippery slope there.
  49. And, we see this, and perhaps one example
  50. is North Korea, which reportedly
  51. uses Monero, maybe even mining Monero,
  52. to circumvent transactions in the international
  53. financial system, because they’re subject to
  54. a variety of UN sanctions and restricted
  55. from accessing traditional financial markets
  56. at the moment.
  57. And, one way they are perhaps circumventing that
  58. or trying to get around those is the use
  59. of these kind of more secretive,
  60. less accessible forms of cryptocurrency
  61. such as Monero.
  62. And, there’s a lot of reports that
  63. they’re using that as well.
  64. – Okay, so Bitcoin was utilised within Silk Road
  65. primarily because it was largely anonymous.
  66. But now, we’re seeing people leaving Bitcoin
  67. to go to something like Monero because
  68. it’s even more anonymous. – Potentially more anonymous.
  69. – Potentially, and now we’re seeing governments
  70. getting in on the game.
  71. And, these are governments that oftentimes
  72. are maybe within–
  73. – Maybe less mainstream.
  74. – Yeah, less mainstream, oftentimes kind of tied
  75. to say terrorist financing or other kind of globally
  76. sensitive political topics.
  77. I find it somewhat ironic, first of all,
  78. that you would have the growth of this
  79. next iteration flowing out of the same principle,
  80. anonymity, but it does make sense, especially because
  81. when you have this race to the bottom or slippery slope
  82. that’s the way it goes.
  83. It continues going down.
  84. But, I also think it’s interesting how when you
  85. look at the kind of moral underpinnings
  86. why the founders of cryptocurrencies and Bitcoin
  87. in particular, why they created those currencies
  88. in the first place, it very much, like Ulbricht
  89. and Silk Road, was in and of itself kind of based
  90. on moral principles, the idea that you wanted
  91. to decentralise the marketplace.
  92. You wanted to democratise finance.
  93. And, in many ways allow people to bypass governments
  94. and current forms of currency, right.
  95. And so, it’s interesting that very much like
  96. the Silk Road, and it’s not to say that
  97. all these uses are bad, certainly, but it is interesting
  98. how what was initially perceived as a moral,
  99. at least partially a moral conviction is now in some ways
  100. being, again, I don’t wanna say misused,
  101. but now being utilised in ways that perhaps
  102. weren’t initially anticipated.
  103. – Sure, and so that’s really interesting
  104. because I think if you talk to
  105. kind of visionaries who kind of have a real strong view
  106. about the role of cryptocurrencies it goes
  107. right to your point about they imagine,
  108. many of them imagine a world where actually
  109. fiat currency is replaced by cryptocurrency as part of that.
  110. Because fiat currency
  111. is tied to governments and central banks,
  112. and that mechanism, they feel, is increasingly archaic.
  113. – Inherently oppressive. – Could be, could be.
  114. And so, going to a more transparent system,
  115. a more distributed system of cryptocurrencies
  116. is kind of, that’s what they think the future will be.
  117. Like you’re right, there’s a great deal of irony there
  118. because now not only do you have governments
  119. trying to regulate it more they’re also getting involved
  120. in the use and production of it as well, potentially.
  121. And, there are these minor examples
  122. of governments who have come out and said,
  123. “Hey, we may want to try to kind of issue
  124. “our own kind of cryptocurrency.”
  125. And so, there’s a great deal of irony there.
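
The point made above – that the ledger is fully public while the users behind it remain pseudonymous – can be shown in a few lines. Below is a minimal sketch of a Bitcoin-style ledger entry. Real addresses are derived from ECDSA public keys via SHA-256, RIPEMD-160, and Base58Check; we compress that into a single SHA-256 for brevity, and the "keys" here are random bytes rather than real key material.

import hashlib
import os

def make_address(public_key: bytes) -> str:
    # Hash the public key so the ledger never shows the key itself.
    # Real Bitcoin also applies RIPEMD-160 and Base58Check encoding;
    # a single SHA-256 is enough to make the pseudonymity point.
    return hashlib.sha256(public_key).hexdigest()[:34]

# Stand-ins for compressed ECDSA public keys (random bytes here).
alice_pub = os.urandom(33)
bob_pub = os.urandom(33)

# What everyone in the world can see on the ledger:
ledger = [
    {"from": make_address(alice_pub), "to": make_address(bob_pub), "btc": 0.5},
]
print(ledger)
# The amounts and the transaction graph are fully public - which is
# what let investigators trace Silk Road funds once an address was
# tied to a real person. Privacy coins like Monero hide even this.

This is why Bitcoin is better described as pseudonymous than anonymous: the opaque hashes protect identity only until one address is linked to a real name, at which point the whole public transaction graph works against the user.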

Additional Readings

  • Explainer: ‘Privacy Coin’ Monero Offers Near Total Anonymity. (2019). New York Times. Retrieved from: https://www.nytimes.com/reuters/2019/05/15/technology/15reuters-crypto-currencies-altcoins-explainer.html 
  • Jardine, E. (2018). Privacy, Censorship, Data Breaches and Internet Freedom: The Drivers of Support and Opposition to Dark Web Technologies. New Media & Society, 20(8), 2824–2843.
  • Piazza, F. (2017). Bitcoin in the Dark Web: A Shadow over Banking Secrecy and a Call for Global Response. Southern California Interdisciplinary Law Journal, 26(3), 521–546.

2.3.6 Case Study – Silk Road: Trust and Accountability

  1. For me, the first question is,
  2. you know, we talk about this thing called the dark web,
  3. it sounds evil.
  4. Is the dark web, is that an evil thing?
  5. What do you think, Dave?
  6. Yes!
  7. No.
  8. I think, it does show one of the key things that we’re gonna
  9. talk about as we get into the ethics of
  10. financial technology, is the way we define these things,
  11. the way that we describe them,
  12. even how we name them,
  13. will color people’s perception of them.
  14. So our bias towards something can be projected,
  15. not only in the code that you create for, let’s say,
  16. AI going forward, but also in terms of,
  17. again, just the way we characterize these technologies.
  18. So clearly this term dark web,
  19. was probably put forward by individuals who
  20. wanted this to be perceived as primarily a negative thing,
  21. perhaps from a policing or national security standpoint.
  22. But, the reality is, as David made clear,
  23. these technologies were actually created by
  24. the US government for secure communications
  25. between various elements of the US military.
  26. And, there are so many aspects of these technologies
  27. that are utilized every day in order to protect
  28. us and provide us with privacy.
  29. This is one of the major dichotomies that we have
  30. not only in terms of FinTech,
  31. but broadly in terms of regulating privacy
  32. and information in general.
  33. And this is something that has been going on for
  34. quite a long time.
  35. Because, when you talk about ethics,
  36. most people when they talk about ethics,
  37. primarily focus on what is legal,
  38. they focus on the law.
  39. On the one side you have lawyers like us,
  40. who teach people about the hardline rules
  41. – black and white rules, about what is acceptable
  42. and what is not.
  43. And those have been largely defined by society
  44. through the codes and laws that we have in place.
  45. On the flipside, you have the moral, more ambiguous,
  46. sometimes subjective aspect of ethics,
  47. where this can be related to culture or history
  48. or even religion – so many aspects of culture
  49. that feed into what is perceived to be
  50. acceptable in society.
  51. And, governments – the pendulum of regulation
  52. swings back and forth,
  53. in terms of how much to regulate,
  54. and then how to back off that regulation.
  55. So, to use an example,
  56. I think if you were to go to someone and ask:
  57. do you utilize communication tools like WhatsApp,
  58. for example, and do you find that those
  59. communication platforms are valuable because
  60. they encrypt the communication?
  61. I think most people would say unequivocally,
  62. yes, of course.
  63. Right.
  64. If you were to say
  65. I provide you this software,
  66. but, someone from the NSA or someone from
  67. the police is going to be listening to all your
  68. communication and documenting that communication
  69. – I think most people would have adverse,
  70. visceral reactions to that.
  71. So, we want that for ourselves in terms of
  72. privacy and ownership of our own data,
  73. control of what the world knows about us.
  74. But then the flipside is, there are very valid concerns
  75. in terms of safety, in terms of national security,
  76. and so you’ll see scenarios where,
  77. like the San Bernardino shooting,
  78. where big segments of the population
  79. – even though they for themselves would advocate
  80. for privacy and security –
  81. were simultaneously asking Apple,
  82. hey you gotta jump on this, you’ve gotta crack this phone
  83. so that we can ensure these types of attacks
  84. don’t continue occurring.
  85. And I feel like, this is where we are right now
  86. in terms of this dichotomy, this paradox,
  87. in terms of privacy for ourselves
  88. and the broader social good.
  89. So can we, can a single country manage that debate?
  90. Absolutely not, and this is the issue.
  91. Again, if you go back to regulation as an example,
  92. whether it’s trade, whether it’s financial regulations
  93. – even contracting.
  94. Simple things like contracting.
  95. There are challenges when initiating these types of
  96. transactions and legal relationships
  97. in a broad cross-border standpoint.
  98. And especially with anything
  99. that’s related to technology,
  100. especially if it’s on the internet,
  101. you’ve got servers that are hosted in multiple countries,
  102. you’ve got pretty much everything going through the US
  103. at some point right now.
  104. And you have some countries like the US
  105. that have a very broad mandate in terms of
  106. extraterritoriality to their law enforcement,
  107. where they will go into another country
  108. – they will actually nab people, very much like executives
  109. of Chinese companies that have been nabbed,
  110. not even on US soil, related to what the US government
  111. views as its right to enforce regulation.
  112. And then you have other governments that are
  113. completely hands-off and don’t even have
  114. regulation in these areas.
  115. Now, another example would be here in Hong Kong,
  116. very very small place, but it is a finance centre
  117. and a FinTech centre.
  118. And, a lot of the data that is here is actually
  119. hosted outside of Hong Kong, and so,
  120. the very act – if you set up a bank account,
  121. if you click on, you know, iTunes,
  122. and you agree to your data being collected –
  123. what you may not realize is that a lot of the time that data
  124. is actually stored elsewhere,
  125. so you have multiple privacy and data ordinances
  126. and regulations that are going to apply
  127. just to that one subset of data.

Additional Readings

  • Chen, J. (2019). User Choice: The Simple Solution to the Tech Trust Crisis. World Economic Forum. Retrieved from https://www.weforum.org/agenda/2019/02/user-choice-the-simple-solution-to-the-tech-trust-crisis/

2.3.7 Case Study – Silk Road: Privacy

  1. So that’s a great explanation – I think a great way
  2. for us to start and thinking through the topic.
  3. I guess it fundamentally gets to a core question:
  4. do people that use these technologies,
  5. be it the dark web, the deep web, or just normal everyday
  6. applications that everybody uses,
  7. do they have a fundamental right to privacy in their use?
  8. Or, by virtue of saying:
  9. hey I want to use this application,
  10. are we basically saying I am giving up some level,
  11. some measure of privacy,
  12. and is that what pushes people into using things
  13. that are below the surface web, be it
  14. the deep web or the dark web?
  15. This has been a question in terms of the right to privacy
  16. – it’s not a new question, but a centuries-old question
  17. that goes back to deeply held moral and legal beliefs
  18. in terms of the rights to privacy.
  19. So a lot of major legal questions including abortion,
  20. and other things around the world actually
  21. get back to the same question of right to privacy.
  22. What right do I have to engage in an activity within
  23. my own home as long as I’m not harming other people.
  24. And this is just an extension of that,
  25. where this data is being projected publicly and it’s
  26. a really complicated issue.
  27. Because, on the one hand, when you say the right,
  28. well, first, there has to be a granting of that right.
  29. There either needs to be a legal principle
  30. for example, within a constitution or within the law that
  31. says you have the right to this particular thing
  32. – in this case,
  33. the ownership or control of your own data.
  34. Then you may even have a higher moral right,
  35. so kind of an Aristotle or even religious
  36. right to privacy.
  37. Say, I’m an individual and therefore I have
  38. the right to control who I am, my own image,
  39. my own likeness, the way I’m projected to society.
  40. But then, beyond that, you have those kinds of
  41. daily ticky-tack opportunities that are contractual
  42. in nature where we often give away these rights
  43. – and we agree to, not a violation of privacy,
  44. but certainly limitation in terms of our privacy
  45. and our own data.
  46. At least eroding our privacy.
  47. Yeah exactly.
  48. And so a great example of this is,
  49. just recently in one of my classes,
  50. I had a number of students sit down
  51. and read through the terms and conditions that
  52. they had to click to accept to use a particular,
  53. very well-known app on the phone.
  54. You can say it.
  55. Well, I don’t want to put them on the spot.
  56. And, you know, it utilizes photographs,
  57. and not a single student
  58. had ever read these terms and conditions,
  59. though all of them were using it.
  60. They all use it. Yeah exactly.
  61. Almost all of them are using the application,
  62. none of them had ever read the terms and conditions
  63. – and as we went through it clause by clause
  64. there was many things that surprised them.
  65. Particularly about the ownership,
  66. not necessarily the ownership, but the use of their data,
  67. and I think this will become a broader issue
  68. particularly when it comes to financial data as well.
  69. Yeah, and we haven’t really gotten into AI
  70. and facial recognition software yet, but just imagine:
  71. we have potentially thousands and thousands of
  72. images of our face, of our facial expressions,
  73. that are out there now – that we have provided,
  74. well, not to the public as such,
  75. but essentially to apps and other websites,
  76. giving them the right to publish these things,
  77. often very publicly.
  78. And when you get into things like deepfakes with
  79. video technology that can now take images
  80. from an app, say like Instagram or Facebook,
  81. and then actually alter them in a way that
  82. creates videos that are very life-like,
  83. that are very realistic.
  84. This is where I think, several years into the future,
  85. I think people are really going to question
  86. why they were so willing to put images of
  87. themselves on the Internet.
  88. There is one interesting side-note, again,
  89. not to bring this back to parenting,
  90. but I have talked to a lot of parents in terms of
  91. the way they utilize or allow their children to utilize
  92. smart phones within their personal lives. Right.
  93. This is something that I think we are all still kinda
  94. wrestling with, because we don’t understand the
  95. implications of this.
  96. So one of the things that my wife and I decided
  97. to do is to never, or at least not for an extended
  98. period of time, put photos of our children online.
  99. And the primary reason goes back to this concept,
  100. this fundamental right to privacy.
  101. Who has the right to decide to put your image
  102. publicly on the Internet?
  103. And so, the example that I provided in the past is:
  104. imagine if you’re going to a job interview at 21 years old,
  105. your first interview,
  106. and your potential employer has access to
  107. 10,000 images of you from the time that you were born
  108. to the time that you graduate
  109. – and you never consented to that,
  110. you were never asked whether or not
  111. that was good or allowable,
  112. but it was put out there.
  113. And again, this is not prescribing a moral solution
  114. to other people,
  115. but this is an example of how we as society now
  116. have to go back.
  117. Now that the technology is out there,
  118. we now have to, from a cultural-lag perspective,
  119. we now have to go back and re-define
  120. how we are willing and content
  121. to engage and utilize that technology.

Additional Readings

  • What If: Privacy Becomes a Luxury Good? (2017). World Economic Forum. Retrieved from https://www.weforum.org/about/what-if-privacy-becomes-a-luxury-good

2.4.1 Case Study – Blockchain and Foreign Remittances

  1. For our next case,
  2. let us tell you a little bit about Hong Kong,
  3. where both of us have lived for approximately 10 years.
  4. As many of you know,
  5. Hong Kong is dynamic, global,
  6. and one of the most interesting cities in the world.
  7. A part of Hong Kong’s story that
  8. most casual observers are not aware of
  9. is that embedded within Hong Kong’s cosmopolitan make-up
  10. are hundreds of thousands of women that provide
  11. childcare, home care, and other household duties
  12. for many of Hong Kong’s families.
  13. These women are designated as Foreign Domestic Workers,
  14. but usually referred to as “helpers”, “a-yi”, or “Aunty”.
  15. There are approximately 400,000 of these women
  16. working in Hong Kong,
  17. most hailing from the Philippines or Indonesia.
  18. These women are generally paid around
  19. US$570 a month, or roughly US$7,000 a year,
  20. most of which is remitted back to their home countries
  21. to support their families.
  22. The reality is that most of these women
  23. work really hard for a salary that you and I may not consider that high,
  24. but that salary is almost double the
  25. GDP per capita in the Philippines.
  26. And in the aggregate, these remittances by overseas workers,
  27. according to World Bank data,
  28. account for approximately 10% of the Philippines’ GDP.
  29. So individually and at a national level,
  30. the money really adds up and the
  31. impact of these wages is a very big deal.
  32. Now how does this money in Hong Kong make its way
  33. to a family living in a village somewhere in the Philippines?
  34. Well, besides being one of our course instructors,
  35. we are really fortunate that David Bishop is one of the world’s
  36. foremost experts on issues related to domestic helpers
  37. and how to protect them from exploitation.
  38. So, let’s hear it from him about the issues
  39. these women face when sending money back home.
  40. So, for you out there,
  41. you might think that if you were going to send money,
  42. maybe you have a bank account
  43. and you would just do a bank transfer.
  44. Simple, problem solved.
  45. Unfortunately, for the tens of millions of migrant workers around the world,
  46. this is usually not possible,
  47. since they are generally unbanked on both sides.
  48. Meaning, the foreign domestic workers in Hong Kong,
  49. many of them don’t have a bank account here in Hong Kong
  50. and their families on the other side,
  51. people they are sending money to,
  52. they typically don’t have a bank account either.
  53. So the workers receive their wages in cash
  54. and they have to figure out how to get that cash
  55. from Hong Kong to their family
  56. in a remote village somewhere perhaps in the Philippines.
  57. To fill such needs, money remittance companies
  58. have sprung up all over the world,
  59. the most famous probably being Western Union.
  60. And for decades this is how people transferred money.
  61. As part of this process,
  62. there are two important things to note,
  63. which might not be apparent.
  64. First, there is a physical component when remitting money.
  65. A worker has to physically go to one of these locations
  66. to actually hand them cash.
  67. Then on the other side,
  68. there is another physical location,
  69. where the receiver has to go to pick up the money.
  70. So both sending and receiving are very time- and labour-intensive,
  71. due to standing in lines, walking long distances,
  72. and perhaps waiting for and using public transportation,
  73. which comparatively might not be cheap.
  74. In addition, many of these workers only have one day off a week,
  75. usually Sunday, so much of that day could be
  76. wasted trying to send money home.
  77. Second, is an issue of financial literacy.
  78. These money remittance companies charge fees
  79. that you and I may consider excessive,
  80. sometimes as high as 8 or 9% per transfer.
  81. Additionally, currency conversion fees are typically not competitive.
  82. So, even if a remittance company has a low rate for sending money,
  83. they will likely make money on the currency-conversion,
  84. like when converting from, say, HKD to Philippine pesos.
  85. On top of all that, sometimes remittances can take time,
  86. at least a few days if not longer.
  87. I’m not saying these companies shouldn’t make money
  88. for providing a service,
  89. but frequently their customers are not really that informed
  90. or have limited options.
  91. So a natural question is:
  92. what if that friction, that lost time,
  93. or those unnecessary fees could be avoided or at least reduced?
  94. For many, the answer to that question,
  95. or at least an important component to that question,
  96. is the use of blockchain technology.
  97. Today there are a number of remittance services
  98. that are trying to employ some level of blockchain to minimize
  99. many of the frictions that we have discussed,
  100. by promising to make remittances more efficient,
  101. secure and/or affordable.
  102. As these innovators pressure incumbents,
  103. there will be a shift, first a trickle but then a wave,
  104. as users become comfortable adopting new technology,
  105. bridging any cultural lag and
  106. learning to trust new advances in technology.
  107. Overall, the breakthrough in blockchain
  108. is really exciting and will be a game-changer
  109. for many of these workers as well as the
  110. millions of people around the world that transfer money daily.
  111. And in Module 6,
  112. we’ll be looking into a really cool
  113. blockchain remittance service company,
  114. called BitSpark,
  115. that was formed right here in Hong Kong.
  116. So stay tuned for that.
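
To see how quickly those costs add up, here is a back-of-the-envelope calculation in Python. The US$570 wage and the 8% transfer fee match the figures quoted in the lecture above; the 2% currency-conversion markup is our own illustrative assumption, not a quote from any specific provider.

wage_usd = 570.0        # typical monthly wage quoted above
transfer_fee = 0.08     # "sometimes as high as 8 or 9%" per transfer
fx_spread = 0.02        # assumed markup hidden in the exchange rate

received = wage_usd * (1 - transfer_fee) * (1 - fx_spread)
lost = wage_usd - received
print(f"Family receives about ${received:.2f}")              # ~$513.91
print(f"Lost to fees: ${lost:.2f} ({lost / wage_usd:.1%})")  # ~$56.09, ~9.8%
print(f"Lost per year: ${lost * 12:.2f}")                    # ~$673

Under these assumptions, nearly 10% of each transfer disappears into fees – more than an entire extra month’s remittance lost every year. That recurring loss is exactly the friction blockchain-based remittance rails are trying to compress.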

Additional Readings

  • Massimo, F. (2018). How Blockchain-Based Technology Is Disrupting Migrants’ Remittances: A Preliminary Assessment. Publications Office of the European Union. Retrieved from http://publications.jrc.ec.europa.eu/repository/bitstream/JRC113484/how_blockchain_is_disrupting_migrants_remittances_online.pdf 
  • A Migrant Centered Approach to Remittances. (2017). International Labour Organization. Retrieved from http://www.ilo.org/global/topics/labour-migration/policy-areas/remittances/lang–en/index.htm
  • Record High Remittances Sent Globally in 2018. (2019). World Bank. Retrieved from https://www.worldbank.org/en/news/press-release/2019/04/08/record-high-remittances-sent-globally-in-2018 

Module 2 Conclusion

  1. We believe blockchain has the potential to be a revolutionary technology,
  2. like the Internet 20 years ago.
  3. It’s truly exciting to consider its possibilities.
  4. But like many such technologies,
  5. there are implications of its widespread use
  6. that are not always initially apparent and are difficult to address
  7. once the technology has taken hold.
  8. In such situations,
  9. it’s helpful to use frameworks to consider risk and implications.
  10. One such framework that we’ve found to be meaningful is
  11. “The Blockchain Ethical Design Framework”.
  12. This Framework was written by Cara LaPointe and Lara Fishbane
  13. and was published by Georgetown University’s Beeck Center,
  14. which focuses on Social Impact and Innovation.
  15. We’ve included a link to the report below.
  16. The framework they propose is focused on six guiding questions
  17. when using blockchain as a solution:
  18. How is governance created and maintained?
  19. How is identity defined and established?
  20. How are inputs verified and transactions authenticated?
  21. How is access defined, granted, and executed?
  22. How is ownership of data defined, granted, and executed?
  23. And finally, how is security set up and ensured?
  24. These are important questions
  25. and they can assist each of us in thinking deeply about
  26. the impact of blockchain technologies—
  27. either as a user or if we intend to deploy blockchain
  28. as a solution to solve a problem.
  29. More broadly, this course is ultimately about asking questions
  30. and having the desire and courage to do so.
  31. We will return to many of the themes that have emerged
  32. as we have explored blockchain and will continue to ask such questions
  33. as we consider other technologies and their applications
  34. in later modules.
  35. Ok look, we think blockchain is really really exciting,
  36. really game-changing technology.
  37. And so from our perspective, we want you to think about blockchain
  38. in the same way that we think about the Internet 20 years ago.
  39. In the mid-90s, there was this advent of a new age of information,
  40. this thing called the Internet that everyone was excited about.
  41. And as a result of that,
  42. it gave us access to more knowledge
  43. and made knowledge more available to people
  44. than at any point in human history.
  45. Over the last few years,
  46. in the wake of fake news, echo chambers, etc.
  47. as the cultural lag is catching up,
  48. we now have realized that this technology has also come at a cost.
  49. In the next module,
  50. we’ll be looking at compliance, and regulation and rules.
  51. But the thing about laws is that they are typically retrospective,
  52. they look backwards.
  53. Therefore, it is important for us,
  54. collectively, individually, but especially as a society,
  55. to proactively look ahead and think about
  56. what it is that we are willing to pay,
  57. what it is that we are willing to give up
  58. in order to have these technologies in our lives.
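
For readers who intend to apply the framework in practice, here is a minimal sketch that turns the six guiding questions listed in the conclusion above into a reusable design-review checklist. The checklist structure and the review helper are our own illustrative additions; only the questions themselves come from Lapointe and Fishbane’s report (linked below).

# The six questions are quoted from the report; everything else here
# is our own illustration.
GUIDING_QUESTIONS = [
    "How is governance created and maintained?",
    "How is identity defined and established?",
    "How are inputs verified and transactions authenticated?",
    "How is access defined, granted, and executed?",
    "How is ownership of data defined, granted, and executed?",
    "How is security set up and ensured?",
]

def review(answers: dict) -> list:
    # Return the questions a proposed blockchain solution has not yet
    # answered - a gap list for the design team to work through.
    return [q for q in GUIDING_QUESTIONS if not answers.get(q, "").strip()]

# Example: a team that has only thought about governance so far.
answers = {GUIDING_QUESTIONS[0]: "Open-source foundation, elected maintainers."}
for gap in review(answers):
    print("Unanswered:", gap)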

Additional Readings

  • Lapointe, C., & Fishbane, L. (2018). The Blockchain Ethical Design Framework for Social Impact. The Beeck Center for Social Impact + Innovation. Retrieved from http://beeckcenter.georgetown.edu/wp-content/uploads/2018/06/The-Blockchain-Ethical-Design-Framework.pdf

Module 2 Roundup

  1. Hey everybody.
  2. We’re back with our roundup for week two.
  3. We really appreciate all the active participation
  4. this week in the discussion board.
  5. There were several really engaging discussions
  6. and questions that came through in the forums,
  7. which is always really great to see.
  8. – Yeah, honestly, one of our hopes as we designed the course
  9. is that it would not be unilateral.
  10. Meaning us just giving content to you all,
  11. but multilateral in the sense that you’re participating
  12. and engaging and pushing us as well,
  13. which really seems to be happening.
  14. Seeing this type of activity
  15. is really rewarding and gratifying for us,
  16. because we think this is how
  17. some of the best learning occurs.
  18. – Now we hope to continue that in the next module.
  19. As all of you are aware, in module one,
  20. we provided a broad framework,
  21. by which to analyse some of the core technologies
  22. at the heart of FinTech,
  23. and in module two, the first technology
  24. we really consider is blockchain.
  25. From the comments in the discussion forum,
  26. it seems many of you have been
  27. thinking about similar questions.
  28. We wanted to spend the next few minutes
  29. following up on some great questions
  30. and contributions that were made.
  31. We want to revisit some of the discussion questions
  32. that we thought were really interesting and compelling
  33. in a few different ways.
  34. The first one that really comes to mind,
  35. started off with a comment that RichardStampfle made
  36. about the idea of free markets,
  37. and that generated a lot of back and forth
  38. between a number of course participants
  39. including Jstout84 and Peter-NYC,
  40. and a number of other student participants
  41. which we thought was a really great interaction
  42. and the type of multilateral discussion
  43. that we hope to see generated in the forums.
  44. Dave, what are your thoughts on this broader topic
  45. about this idea of free markets?
  46. How it operates in the context
  47. of some of the principles
  48. that we’re discussing in the course?
  49. – Okay, so we’re gonna solve
  50. one of the great questions of capitalism and economics
  51. here in the next five minutes.
  52. – Five minutes.
  53. Sure, this is how we operate.
  54. – Sure, no problem.
  55. No, I mean the underlying question,
  56. and first, maybe just as a side note,
  57. some of the comments are really really well thought out
  58. and so we appreciate the high level of engagement.
  59. It’s clear that you guys are pretty damn smart,
  60. so we really appreciate the engagement.
  61. It’s been really good.
  62. Making us think a lot.
  63. Free markets versus government regulation.
  64. This is the age old question from an economic standpoint,
  65. and I think when talking about FinTech Innovations,
  66. and especially blockchain,
  67. this is that intersection,
  68. where I think people really
  69. start to have significant disagreement.
  70. Certainly, within a government regulatory standpoint,
  71. that is the reason why most governments
  72. are wary of this type of thing,
  73. especially with cryptocurrency.
  74. Who is able to control it?
  75. One of the commenters was Jstout84.
  76. I’m just gonna read this,
  77. because he had a really good quote from Thomas Hobbes.
  78. He said, quote, “People live nasty brutish lives.
  79. Always seeking to undermine each other.”
  80. Close quote, and he said is it too idealistic
  81. to think that completely free markets can function
  82. for the benefit of all,
  83. and I think again, this is the question for this course.
  84. People call it the fourth industrial revolution.
  85. I think from my standpoint it’s really
  86. just capitalism 2.0 or 3.0.
  87. Can we as a global society,
  88. come together in such a way
  89. so that these massive, currently disparate resources,
  90. can be shared in such a way so as to
  91. maybe not create complete equality,
  92. but make it so people don’t feel so disengaged,
  93. so separate.
  94. – Marginalised.
  95. – And then potentially which,
  96. obviously can lead towards violence
  97. and other types of challenges.
  98. I think that is the great question,
  99. that the original commenter was asking,
  100. and I think in module five,
  101. we’re gonna take this question a little bit further,
  102. and really look into that.
  103. A lot of the FinTech Innovations
  104. were started based on these broad questions
  105. of decentralisation of power,
  106. or democratisation of finance,
  107. and really about eroding these power structures
  108. that have existed for centuries.
  109. Sometimes millennia,
  110. and the question that we have for you in module five is,
  111. will governments,
  112. will large institutional holders of power like banks,
  113. will they actually allow that to happen,
  114. and I think again, that is the underlying
  115. fundamental question.
  116. I think to answer the specific question you asked,
  117. I don’t think it’s gonna happen any time soon,
  118. but it is really exciting.
  119. I think the way I would flip it around
  120. is if FinTech works the way that some people
  121. think that it will,
  122. will they have any power to stop it,
  123. or is it just an eventuality,
  124. where cryptocurrency,
  125. various forms of decentralised systems,
  126. will make it so that the very concept of government
  127. or finance and stuff just becomes eroded
  128. and just transforms into something new.
  129. I think that’s really interesting.
  130. Not anytime soon,
  131. but it’s a very interesting hypothesis.
  132. – I think maybe to piggyback on some of those points,
  133. ultimately it just comes down to
  134. a tension between where we think regulation plays
  135. versus where we think this invisible hand idea
  136. that Adam Smith talked about comes in,
  137. and I think the way we think about it,
  138. is they exist on a spectrum
  139. and that spectrum can change
  140. depending on which industry we’re talking about.
  141. Depending on which country we’re talking about,
  142. and which microcosm of the economy we may be talking about
  143. at a particular time,
  144. but as some new technologies come into play,
  145. maybe some things we talk about in module five,
  146. or things like smart contracts,
  147. or other things that we end up touching on,
  148. how does that help facilitate
  149. less friction in transactions?
  150. Which is ultimately at the heart,
  151. I think, of what we’re trying to do.
  152. At least one school of thought is,
  153. when we bring regulation into financial transactions
  154. and the economy,
  155. we want to remove transaction costs as much as possible,
  156. but at the same time have a level of fairness,
  157. and protection of certain players in the game as well,
  158. and so this is an interesting tension.
  159. Unfortunately, I don’t think we answered it
  160. in five minutes, sorry.
  161. – No, we probably didn’t.
  162. I mean again, you guys can answer these questions
  163. in the forum perhaps better than we can.
  164. It’s clear that you guys are really really
  165. highly thoughtful on these types of questions,
  166. but I do think it brings up,
  167. another thing that within the forums
  168. and within the questions you’re asking,
  169. does seem to come up over and over again.
  170. There’s this interesting dichotomy,
  171. or in Chinese, they say,
  172. (foreign language speaking)
  173. when these two opposing forces,
  174. that a lot of you have identified
  175. in this kind of FinTech space,
  176. and the idea is that on the one hand
  177. we don’t trust traditional financial institutions,
  178. maybe even governments,
  179. so therefore we want to,
  180. we hope that financial technology innovations
  181. push us towards a more democratised, decentralised system,
  182. and yet when we asked you who do you trust, it was –
  183. – Traditional financial institutions.
  184. – Traditional financial institutions, right,
  185. and the reason why,
  186. the underlying reason why,
  187. is because there wasn’t a track record
  188. and more importantly there wasn’t
  189. a regulatory structure that provided
  190. that kind of social safety net,
  191. the insurance, the other types of,
  192. basically that framework that ensures
  193. that if you invest in some cryptocurrency
  194. or other system, that your money will be safe,
  195. and I think this is such an interesting dichotomy
  196. that we’re running up against as a society,
  197. and it leads into for example,
  198. one of the things that we talked about this time,
  199. smart contracts.
  200. So we had people talking about
  201. how smart contracts could or would,
  202. or maybe even are influencing their lives,
  203. and so people, a lot of people
  204. touched on real estate transactions.
  205. Related to real estate, they also touched on
  206. the actual storage of documents,
  207. like deeds and other land records
  208. in government systems.
  209. What were some of the things that stood out to you
  210. in terms of some of the issues or ways
  211. that people saw smart contracts,
  212. becoming more relevant in our lives,
  213. that you think stood out?
  214. – In terms of the relevance of smart contracts?
  215. – Yeah.
  216. – Yeah, so I think smart contracts,
  217. it’s kind of almost like a buzzword.
  218. It sounds great.
  219. I think a lot of legal structures,
  220. or just law in general,
  221. we want to make sure that we get
  222. the right structure in place,
  223. before we obviously implement,
  224. and this is I think where,
  225. some of the potential issues could come,
  226. because if we don’t think about this comprehensively,
  227. then it can create potentially more problems than it solves
  228. or at least they kind of cancel each other out,
  229. and so that’s probably where the real concern is
  230. for me in particular.
  231. – Can I give a quick example of that?
  232. – Yeah.
  233. – One of the things that came up regularly in the comments
  234. was real estate transactions,
  235. especially the buying and selling of homes,
  236. and it was really clear that a lot of us
  237. are tired of paying all those middleman fees.
  238. Tired of paying for real estate agents.
  239. – Broker fees.
  240. – Yeah, broker fees.
  241. – Lawyer fees.
  242. – Lawyer fees.
  243. There’s so many people in the middle
  244. that are grabbing pieces of that transaction,
  245. so I think very naturally,
  246. because that is the largest investment
  247. that most people make,
  248. that is also the largest type of transactional fee
  249. that we tend to pay,
  250. and it’s really easy to look back
  251. and say, I’m buying this house.
  252. I’m taking all the risk.
  253. Why in the hell am I paying this money, this fee,
  254. to a real estate agent,
  255. who just unlocked the door for me?
  256. It does seem very natural.
  257. Now if you didn’t remember from my background,
  258. my original legal background
  259. was in commercial real estate,
  260. and let me give you the flip side to that coin,
  261. because I think this goes exactly to your point.
  262. The reality is that real estate agent,
  263. and agent is a legal term, right?
  264. That means a fiduciary relationship,
  265. which we talked about already.
  266. This idea that it is someone who is
  267. put in a position of trust,
  268. and therefore they have a higher standard of trust
  269. and legal requirements,
  270. because they are meant to be there
  271. to help you and guide you.
  272. – Normally when we talk about agent,
  273. we talk about principal and agent.
  274. – Exactly.
  275. – Somebody delegating trust, authority,
  276. power to the agent, right?
  277. – Yep, exactly.
  278. – In this case the property owner.
  279. – Yeah, property owner, or the purchaser,
  280. you have to use an agent, legally, in many cases,
  281. in order to go through that process,
  282. and again it seems so unnecessary,
  283. and why would we pay that person
  284. for doing something so little,
  285. and just the one thing that I want you to look at
  286. going forward is,
  287. you’re not necessarily paying them that fee
  288. for what they are doing,
  289. you’re paying them that fee,
  290. in order to ensure they do something if it goes wrong,
  291. and this is the problem.
  292. This is why we have that same dichotomy,
  293. with the traditional financial system
  294. versus that blockchain based cryptocurrency.
  295. Whatever FinTech system,
  296. is you are paying the lawyers.
  297. You’re paying the title insurance company.
  298. You’re paying the brokers.
  299. You’re paying all of those people along the way
  300. to protect you if something goes wrong.
  301. 99.999, whatever percent of the time,
  302. nothing goes wrong,
  303. and so therefore it seems like that was a waste.
  304. – A wasted cost.
  305. – Yeah, exactly, a wasted transaction.
  306. You think oh my gosh, they got $1,000 for doing nothing.
  307. I can tell you through,
  308. as a former corporate lawyer,
  309. in the commercial real estate space,
  310. it is absolutely money well spent most of the time
  311. and I’ll give you a quick personal example.
  312. My wife and I purchased a home at foreclosure
  313. before we moved to Hong Kong about 12 years ago.
  314. We recently sold that home in 2017,
  315. and when we sold that home,
  316. we realised because the lawyers found it,
  317. that they had actually recorded
  318. the legal description of the land incorrectly,
  319. and because of that they actually
  320. had to go and find the original owner,
  321. get them to sign a new deed,
  322. a corrected deed is what it’s called
  323. to have the actual, the proper.
  324. – Proper description.
  325. – Description on there,
  326. and if they didn’t have that,
  327. then the buyer would not have been
  328. able to get all of their parties to line up.
  329. The mortgage company, the title insurance company, et cetera
  330. and we would have been stuck with a house
  331. that we would not be able to sell, right?
  332. So on the one hand it means
  333. that the original lawyers and stuff
  334. didn’t really do their jobs,
  335. but on the other hand it meant
  336. that because we paid these fees to these people,
  337. they were there to protect us,
  338. and I think again, this gets at the heart,
  339. I’m not saying that they’re worth all of their money.
  340. I’m not saying I don’t also feel the sense of anger
  341. when I have to pay someone a fee
  342. that I don’t really necessarily think
343. they deserve,
  344. and as a lawyer I’ve been on the other end of that.
  345. Probably received money that maybe I didn’t deserve
  346. in the traditional sense,
  347. but the reality is,
  348. the system is there specifically
  349. to deal with that dichotomy that we’re now facing.
  350. We want that protection and we need it,
  351. but now technology could give us extreme efficiency
  352. but with that efficiency comes less certainty
  353. and less protection, so the question is,
  354. how do we develop the efficiency,
355. but also maintain the protection?
  356. I think that’s really hard.
  357. If it’s self executing,
  358. that’s really hard.
  359. – That’s very difficult,
  360. and so I think at the heart
  361. of a lot of the technology we talked about,
  362. be it from blockchain or AI powered mortgage
  363. lending decisions or anything along those lines,
  364. I think that ability to have recourse
  365. is really at the heart of a lot of these things,
  366. when we talk about free markets, regulation,
  367. the role of the law.
  368. How does that play?
  369. Ultimately, at the end of the day,
  370. if something does go wrong,
  371. you want to be able to have some level of recourse.
  372. – Absolutely.
  373. – Ideally being able to talk to somebody.
  374. – Absolutely.
  375. – I think a large part of.
  376. – And the bigger the dollar value is,
  377. the more you’re gonna want that.
  378. – Want that, right?
  379. I think a large part of the issues
  380. that we face when it comes to new technologies
381. is that up till now,
382. nobody has really articulated
383. a solution to what that recourse would be specifically,
384. so if a particular block is incorrect,
  385. if whatever AI decision making that’s going on
  386. in a particular company comes out for whatever reasons,
  387. wrong decision,
  388. how, what is the recourse of the person
  389. who is impacted by that?
  390. – Yeah.
  391. – I think again, that cost,
392. both financial and in time and emotion,
  393. is pretty steep for the average person.
  394. – Very significant, yeah.
  395. Going on to some of the other comments that you have
  396. in regards to blockchain,
  397. and the way that it could potentially impact our lives,
  398. there was some things that stood out to me
  399. that I hadn’t really thought about,
  400. especially in terms of my day to day life.
401. Some commenters talked about traceability of products.
  402. So product liability is a very serious thing.
  403. You want to know that the food that you’re eating is safe
  404. or that the diamond that you’re purchasing is accurate,
  405. or whatever, like it’s described accurately,
  406. and there are some really interesting descriptions
  407. about that.
  408. Here in Hong Kong someone mentioned the milk powder
  409. which is sometimes difficult to trace,
  410. and there have been concerns over milk powder.
  411. – Baby milk powder.
  412. – Baby milk powder.
413. There were some concerns several years ago
  414. about the source of milk powder and what it contains,
  415. and maybe if there’s some traceability on that.
  416. Maybe in terms of fair trade, things of that nature.
  417. One thing that wasn’t mentioned,
  418. that I thought would come up,
  419. because this has come up globally,
  420. in certain contexts, is actually voting,
  421. and voting not in terms of cryptocurrency,
  422. which we did talk about,
  423. but actually voting in terms of governments.
  424. – Political elections.
  425. – Political elections, yeah.
  426. The immutability of the blockchain,
  427. does mean that theoretically,
  428. if you wanted to increase the number of voters,
  429. then the best way to do that
  430. is to not make them physically
  431. go to a polling location,
  432. but let them do that on a mobile application somehow,
  433. that didn’t come up,
  434. but maybe throw that back at you.
  435. Is that something you think that governments
  436. would potentially allow at some point?
  437. – I think the other thing that’s really interesting
  438. when it comes to blockchain and application.
  439. I think one or two commenters did talk about this,
  440. or alluded to it at least,
  441. is the idea of property records, or title.
  442. How do we track that?
  443. This is obviously super important for governments
  444. as well as homeowners like you were talking about
  445. in your own personal experience.
  446. I think in a lot of places in the world,
  447. particularly where database records are not
  448. as comprehensive as people would hope they would be
  449. or not as clear,
  450. one solution that people are hopefully
  451. pointing to is blockchain,
  452. and we see this in countries where
  453. that title record or the deed record
  454. is really spotty, in some senses,
  455. and if you were to be able to get that in place,
  456. then you would be able to clarify
  457. a lot of potential issues,
  458. and unlock a lot of value for people
  459. that own this property to be able to utilise it
  460. in different ways.
461. The real impediment, or at least
462. one of the key impediments, to actually doing that, though,
463. is putting the right records in to begin with.
  464. – It has to be proper in the first place,
  465. otherwise you’re just gonna have
  466. immutable bad data.
  467. – Then going back to try to fix that,
  468. it goes back to what we were talking about.
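To make the "immutable bad data" point concrete, here is a minimal Python sketch of a hash-chained, append-only ledger. The record contents and names are invented for illustration, and a real blockchain land registry would be far more involved; the point is simply that an erroneous early entry cannot be edited in place without breaking the chain, so the only honest remedy is an appended correction, much like the corrected deed described earlier.

```python
import hashlib
import json

def digest(body: dict) -> str:
    """Deterministic SHA-256 hash of an entry's contents."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

class TitleLedger:
    """Toy append-only ledger: each entry commits to the previous entry's hash."""

    def __init__(self):
        self.entries = []

    def append(self, data: dict):
        body = {"data": data,
                "prev": self.entries[-1]["hash"] if self.entries else "genesis"}
        self.entries.append({**body, "hash": digest(body)})

    def verify(self) -> bool:
        """An in-place edit to any earlier entry breaks every later link."""
        prev = "genesis"
        for e in self.entries:
            body = {"data": e["data"], "prev": e["prev"]}
            if e["prev"] != prev or e["hash"] != digest(body):
                return False
            prev = e["hash"]
        return True

ledger = TitleLedger()
ledger.append({"parcel": "Lot 12", "owner": "A. Chan"})  # wrong owner recorded
ledger.append({"parcel": "Lot 12", "owner": "B. Wong"})  # later transfer builds on it

ledger.entries[0]["data"]["owner"] = "C. Lee"  # trying to "fix" history in place...
print(ledger.verify())  # False: the chain itself exposes the tampering
# The honest remedy is to append a correcting entry, not to rewrite the old one.
```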
  469. – I don’t think many of you perhaps realise
  470. how inaccurate a lot of real estate records really are,
  471. or maybe you do recognise it,
  472. and that’s why you’re suggesting this,
  473. but taking Hong Kong as an example,
  474. it’s one of the most modern economies of course,
  475. and as a Commonwealth country,
476. with a legal system that stems from the U.K. system,
  477. if you studied in the common law,
  478. or if you worked in a common law country as a lawyer,
  479. then a lot of your legal training,
  480. can get transferred over.
  481. The vast majority of it gets transferred over,
  482. but here’s the thing.
  483. Every single lawyer, even if you
  484. went through the Commonwealth system,
  485. has to take the conveyancing course,
  486. because the legal conveyancy.
  487. – What’s conveyancy mean?
  488. – Conveyancing is when you transfer ownership
  489. of real estate to another person.
  490. It’s so messed up here,
  491. that everyone has to take it.
  492. – They have their own course that they have to take.
  493. – Yeah, everybody, no matter how much experience you have.
  494. I as a former commercial lawyer,
  495. in a common law country,
  496. that has the same legal background,
  497. they said if you’re gonna get this transferred over,
  498. you still have to take this conveyancing system or course
  499. because it’s so different than everywhere else,
  500. and the rumour is, I don’t know if this is true,
  501. but the rumour is that if you look at the deeds.
  502. If you go back far enough,
  503. any property in Hong Kong,
  504. you could say there’s a conflict
  505. in terms of ownership,
  506. and again, I’m not saying that’s true.
  507. The point is simply to show that.
  508. – Even in sophisticated markets.
  509. – Exactly, where they’ve been keeping records a long time.
  510. – If you think about markets
  511. that are for whatever reason less sophisticated,
  512. then there will be a litany of greater issues to address.
  513. – It’s almost like the less developed it is
  514. the more likely it could work,
  515. because then you almost have a clean slate.
  516. – This is a real difficulty in a lot of places in the world
  517. when it comes to real assets.
  518. Particularly property,
  519. and how those are gonna be conveyed or sold or used
  520. and to get clarity on this would actually help
  521. a lot of these countries and their economies.
  522. – Oh yeah, a tonne,
  523. and so going back, we mentioned voting,
  524. although in the political standpoint,
  525. but voting did come up in this module,
  526. so we want to talk about that briefly.
527. Some of the more interesting conversations,
528. I thought, were again,
  529. very clear that many of you understand this
  530. as well or better than I do,
  531. especially on the technology side,
  532. was about the blockchain,
  533. and from a governance standpoint,
  534. the voting mechanism,
  535. whereby blockchain is controlled.
  536. We mentioned that the majority of blockchain
  537. and cryptocurrency is typically governed
  538. under a one vote system
  539. a majority rule system, excuse me,
  540. and we talked about is that the best way to do it?
  541. What were some of the ideas or thoughts
  542. that came up for you?
  543. – I think the great analogies that usually come up
  544. when we talk about how blockchain is governed
545. or how these different industry groups govern themselves,
546. because there are all these different kinds of communities
  547. of different blocks and chains and applications
  548. that are out there.
  549. The initial analogy is always corporate law
  550. and the principle majority rule.
  551. Besides probably your elementary school teacher using that
  552. to say what are we gonna do next?
  553. Are we gonna go to recess?
  554. Are we gonna eat our snacks?
  555. I’ll let you vote.
  556. Beyond that.
  557. – That wasn’t my elementary.
  558. She was a dictator.
  559. – But beyond that,
  560. corporations as a general rule, one share one vote,
  561. which we mentioned in our course,
  562. and this is something that somehow
563. is ingrained in the political side
564. of how a lot of nations govern,
  565. but also on the corporation side.
566. Now obviously there are exceptions to that
  567. and some of the commenters had some great points
  568. related to that with respect to,
  569. what would be the alternatives?
  570. – Yeah.
571. – So we think of supermajority voting.
572. In the case of a special resolution,
573. you usually require a supermajority,
574. which could be 75%,
575. and then, is that the criterion we want to use
  576. if we want to fundamentally change things,
577. or other people talk about cumulative voting,
578. where people can load up votes
579. on a particular item, so to speak,
580. and concentrate their votes,
581. which would help smaller, minority-type voters,
582. and these are really interesting discussions to have
583. when we think about how we want to govern these things,
  584. because we know at least in certain types of
  585. blockchain communities so to speak,
  586. that there’s some concentration of power by
  587. concentration of either the tokens
588. or whatever they're using as your vote
  589. to say how many votes you’re gonna have.
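As a concrete illustration of the alternatives just mentioned, here is a short Python sketch contrasting one-token-one-vote, a 75% supermajority threshold, and cumulative voting. The holdings, names, and numbers are invented for illustration and do not describe any real chain's governance rules.

```python
# Illustrative holdings only, not any real chain's token distribution.
holdings = {"whale": 600, "fund": 250, "small_1": 100, "small_2": 50}
TOTAL = sum(holdings.values())  # 1000

def support(voters_for) -> float:
    """One token, one vote: fraction of total tokens voting in favour."""
    return sum(holdings[v] for v in voters_for) / TOTAL

# Ordinary resolution (simple majority): the whale alone carries it.
print(support({"whale"}) > 0.5)             # True  (0.60)

# Special resolution (75% supermajority): the whale now needs allies.
print(support({"whale"}) >= 0.75)            # False (0.60)
print(support({"whale", "fund"}) >= 0.75)    # True  (0.85)

# Cumulative voting for 3 seats: each holder gets tokens * 3 votes and may
# pile them all on one candidate, which is what protects smaller holders.
ballots = {
    "whale":   {"A": 1800},   # 600 * 3
    "fund":    {"B": 750},    # 250 * 3
    "small_1": {"C": 300},    # 100 * 3
    "small_2": {"C": 150},    #  50 * 3, concentrated on C as well
}
tally = {}
for ballot in ballots.values():
    for candidate, votes in ballot.items():
        tally[candidate] = tally.get(candidate, 0) + votes
print(sorted(tally, key=tally.get, reverse=True)[:3])  # ['A', 'B', 'C']
# By concentrating their votes, the small holders still win a seat.
```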
  590. – This kind of came up.
  591. In case you’re not going there,
  592. several people mentioned a quote, 51% attack.
  593. What is that?
594. – Yeah, so a 51% attack is a little
595. bit different, in a sense.
596. It all falls under the umbrella of governance,
597. but it's a little bit different
598. in the sense of pure voting.
599. When we were talking about
600. how, for example, bitcoin works,
  601. if we use that as an example,
  602. there are a number of computers out there.
603. Large, oftentimes very specialised computers out there
  604. that are doing mining to calculate
  605. some sort of mathematical problem,
  606. and when that’s solved,
  607. it pops out a coin for you basically, right?
  608. The idea of a 51% attack,
  609. what that relates to is,
610. somebody owning over 50% of the network power
611. in that particular chain.
  612. Once they get to 51%,
  613. they could actually change the records in the ledger.
  614. In the blockchain ledger,
  615. they could change the record.
  616. Now until you hit that threshold you normally can’t.
617. This is what they were talking about with a 51% attack.
  618. Is that a risk basically?
  619. Certainly, in certain communities
  620. that is going to be a risk.
  621. Particularly ones that are not distributed,
  622. and there’s that side of it.
623. Oftentimes on the voting side,
624. what will happen is,
625. miners, for example, are mining problems,
626. and as a reward for mining the problem,
627. they get a cryptocurrency of some sort.
  628. Sometimes certain communities will
  629. use that as the voting metric.
  630. If you’ve popped out 100 coins,
631. then you have 100 votes, maybe,
  632. to decide how things are going to work out,
  633. if we’re going to go this way or that way
  634. on a particular problem.
  635. I think those are where those two things are linked,
  636. but I think in terms of the voting,
  637. if we go back to I’ve got X amount of coins or whatever
  638. then should I be able to vote,
  639. one share, or one vote per whatever I have,
  640. or should there be some other type of system in place
  641. and I think this is where the comments
  642. and the debates were when it came to that.
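For readers who want the 51% intuition in numbers: a classic result from the Bitcoin whitepaper (a gambler's-ruin calculation) gives the probability that an attacker controlling a fraction q of the hash power ever catches up from z blocks behind. The sketch below just evaluates that formula; it is a simplification, not a full model of any real network.

```python
# Probability that an attacker with hash-power share q ever catches up from
# z blocks behind: 1 if q >= 1/2, else (q / (1 - q)) ** z  (Nakamoto, 2008).

def catch_up_probability(q: float, z: int) -> float:
    p = 1.0 - q                     # the honest network's share
    return 1.0 if q >= p else (q / p) ** z

for q in (0.10, 0.30, 0.45, 0.51):
    row = ", ".join(f"z={z}: {catch_up_probability(q, z):.4f}" for z in (1, 3, 6))
    print(f"q={q:.2f} -> {row}")

# Below 50%, the chance of rewriting history decays geometrically with depth;
# at or above 50% it is certain, which is why the threshold, not the attack
# itself, gives the "51% attack" its name.
```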
643. – Yeah, and this actually relates.
644. It may not seem like it, but
645. when we threw in the section about the environment
  646. and we talked about electricity,
  647. we probably didn’t do a very good job
  648. of making it clear how related these two parts are.
  649. Let’s just talk about that briefly.
  650. We talked about the electricity component,
  651. to talk about how the utilisation
  652. of new technologies can have broader implications,
  653. often negative,
  654. and so therefore we need to think about those implications
  655. but here’s the connection to the governance part
  656. which we didn’t really explain very clearly.
  657. The idea is if it requires a lot of electricity,
  658. in order to mine these coins
  659. and to gain some level of control
  660. it also means that the people that are going
  661. to have the most control and the most shares
  662. of that whatever the crypto or blockchain is
  663. are also going to be the people
  664. that have the greatest access,
  665. to the electrical system,
  666. and so what many of you may not realise
  667. is that a lot of the people that control
  668. big percentages of coins and other blockchain based systems
  669. are actually quasi governmentally connected
  670. either through personal relationships
  671. or actual government support,
  672. where they can just use massive amounts
  673. of cheap electricity,
674. in order to run these massive mining farms,
  675. where they’ll mine these coins,
  676. and so that means that you have,
  677. if a government, let’s say,
  678. or someone connected to someone in power,
  679. wanted to gain control over those things
  680. and they had that massive ability,
  681. then they could theoretically gain
  682. some level of control,
  683. maybe even majority control,
  684. and change the rules within that system
  685. and this is an interesting again,
686. this paradox where it's not like a
  687. one coin, one vote system,
  688. where all of those coins are
  689. distributed equally around the world.
  690. It’s actually those that have literal access to power
  691. in this case electricity,
  692. are oftentimes the ones that then get
  693. to write the rules for those things.
  694. – Yeah, so there’s definitely a connection
  695. between those things,
  696. and I think if you’re the average cryptocurrency enthusiast
  697. where you’re buying fractions of bitcoin
  698. or ethereum or whatever,
  699. these are things you generally don’t think about,
  700. but I think clearly in a macro sense
701. there are definitely sociopolitical
702. and socioeconomic links that are driving
  703. or providing the structure to the market
  704. that we play in.
  705. – Exactly.
  706. – In the context of blockchain,
  707. and some of the other technologies
  708. and themes that we end up talking about
  709. through the course,
  710. we often talk about remittances.
  711. Particularly overseas remittances.
  712. I think you had an interesting
  713. experience with that recently.
  714. – I did, so this is one of the areas again.
  715. Probably because my interest in migrant workers
  716. and trafficking and other things in that area,
  717. that I personally am really excited about
  718. from a blockchain perspective,
  719. the idea of someone being able to transfer money,
  720. peer to peer very quickly.
  721. Perhaps using a mobile device, instantaneously,
  722. with very very low fees.
  723. I’m super excited about that.
  724. Not possible for everybody yet,
  725. but I think within the next three or four years,
  726. certainly five or 10,
  727. it will be available to pretty much everybody
  728. assuming countries allow their currencies to be converted
  729. and what not,
  730. but so this week actually,
  731. just by happenstance,
732. I was in a position where I was transferring money
  733. to a friend of mine in the Philippines,
  734. and so I had to physically go into
  735. the downtown portion of Hong Kong.
  736. I had to go to this place called
  737. the Worldwide House.
  738. I had to write out something on paper,
  739. in this really old form,
  740. which by the way they messed up anyway,
  741. and they actually took my name down wrong
  742. on the transaction form,
  743. even though they had my ID and everything,
  744. and it was a cumbersome expensive process,
  745. relatively speaking.
  746. Now the money made it there,
  747. and so I think if you think over the past several decades
  748. these remittances have allowed migrant workers
  749. to really spread out across the globe,
  750. and provide for their families and friends,
  751. really from everywhere.
  752. It’s amazing in that regard,
  753. so don’t get me wrong,
  754. but I can see what’s coming next,
  755. and I can think about as David,
  756. you said in the module,
  757. even if you can just decrease
  758. those fees by 1% or 2%.
  759. – Big impact.
  760. – You’re talking about billions of dollars
  761. going to the developing nations of the world.
  762. I think it’s really really exciting.
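The back-of-envelope arithmetic behind that claim, with an assumed figure for scale (World Bank estimates put global remittance flows on the order of several hundred billion US dollars a year around the time of this course):

```python
# Illustrative only: the flow figure below is an assumption for scale,
# not an official statistic.
annual_remittances = 600e9          # assumed global remittance flow, USD/year

for fee_cut in (0.01, 0.02):        # the 1% or 2% fee reduction mentioned above
    savings = annual_remittances * fee_cut
    print(f"A {fee_cut:.0%} cut in average fees keeps about "
          f"${savings / 1e9:.0f} billion a year with senders and recipients")
```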
  763. What I did, I never do this,
  764. but I actually took my phone
  765. and filmed using a little vlog or selfie,
  766. whatever it’s called.
  767. Whatever the kids are calling it these days,
  768. and made a little video of myself.
  769. We’re gonna put that together,
  770. and send that out to you as well,
  771. so you can see what we’re talking about
  772. when we talk about the remittance process.
  773. – Overall the experience was.
  774. – I’ve done it enough times that
  775. it was to be expected.
  776. It was not good nor bad.
  777. The money made it there.
  778. I’m grateful for the process,
  779. but I’m really excited for the day
  780. when I can just do it on my phone.
  781. – Well that’s it for this round.
  782. Thanks again for your participation and contributions.
  783. We’ve thoroughly enjoyed reading the comments
  784. and discussing the ideas amongst ourselves as well
  785. which we frequently do after reading them,
  786. we’ll ping each other or talk to each other,
  787. see each other.
  788. – A lot of questions.
  789. I have no idea.
  790. What do I say?
  791. – These are really great insights,
  792. that you all are sharing which we appreciate.
  793. Now moving on in our next module,
  794. we’re gonna explore cybersecurity and crime,
  795. which will build on what we’ve already covered
  796. in modules one and two,
  797. so we look forward to seeing all of you again
  798. in our next roundup after module three.

Module 3 Cybersecurity and Crime

3.0 Module 3 Introduction

  1. – Welcome back.
  2. In this module we’re going to explore
  3. a really interesting part of FinTech
  4. that frequently ends up in news reports.
  5. Cybersecurity and digital crimes.
6. The ubiquity of technology and our reliance on it
  7. in daily life makes cybersecurity a really important
  8. and fascinating topic.
  9. – Now, I’m sure you’ve seen reports of hacks
  10. and personal information of millions being exposed.
  11. Or perhaps you’ve even been a victim of cybertheft
  12. or other digital crime yourself.
  13. Now as devices, accounts and other aspects
  14. of our everyday life become more interconnected,
  15. the convenience that we gain is also balanced
  16. by the necessity for cybersecurity.
  17. Now, for many institutions, cybersecurity is somewhat like
  18. the story of Sisyphus in Greek mythology.
  19. Now, if you’re familiar with Sisyphus,
  20. he was sentenced to roll a large rock up a hill
  21. that would roll back down after it got to the top
  22. at the end of the day.
  23. And this forced him to start over again,
  24. day after day after day.
  25. Now, similarly institutions are under near constant attack
  26. by cyber attackers, with new threats always appearing.
  27. So who is responsible for thwarting these threats
  28. and protecting user data?
  29. – And for all the benefits we believe FinTech’s rise
  30. will create, FinTech’s potential for good is also tempered
  31. by the potential for it to be used for illicit purposes.
  32. Given that, it’s really important for us to consider
  33. these risks through the principles of trust,
  34. accountability, proximity, privacy and cultural lag
  35. that have served as touchstones throughout the course.
  36. So in this module we want to explore topics
  37. of cybersecurity and digital crimes
  38. and their importance in considering FinTech
  39. through some movie-like, but actual true stories.
  40. So to get us started, we’re gonna look at
  41. a billion dollar bank heist.

3.1.1 Case Study – Billion Dollar Bank Heist

  1. In February 2016,
  2. at Bangladesh Central Bank’s headquarters in Dhaka,
  3. something occurred that laid bare the
  4. profound weakness in the global financial system.
  5. When banks move money around the world,
  6. they use a system called SWIFT –
  7. Society for Worldwide Interbank Financial
  8. Telecommunication – which is a consortium that
  9. operates a trusted and closed computer network for
  10. communication and payment orders between banks.
  11. And today, SWIFT is used by over 11,000 financial
12. institutions in more than 200 countries and territories
  13. around the world.
  14. And one of them is Bangladesh Central Bank
  15. – BCB – with its headquarters in Dhaka, Bangladesh.
  16. On a daily basis, staff members at BCB would go into a
  17. highly secured room with closed-circuit
  18. security cameras, log into SWIFT and dispatch
  19. payment orders with encrypted communication.
  20. 8000 miles away though,
  21. the New York Federal Reserve Bank
  22. is the gatekeeper of much of world banking, and
  23. hosts accounts for 250 central banks and governments
  24. – including the BCB.
  25. When the New York Fed receives a payment order,
  26. it follows the instructions
  27. and sends the money to the recipient.
  28. At the same time, it sends a confirmation
29. back to the sender, which in this case is BCB,
  30. marking the transaction completed.
  31. This process happens all around the world,
32. every single day, with about $5 trillion
  33. being directed via SWIFT.
  34. And the system is designed to be unbreachable.
  35. On Thursday, February 4th, 2016,
  36. 35 payment orders using the credentials of the BCB’s
  37. employees were sent via SWIFT to the New York Fed.
  38. 5 of them went through, but the other 30 requests
  39. were blocked as the Fed system had detected
  40. a sensitive word in the recipient’s address
  41. and therefore flagged those transactions as suspicious.
  42. The next day, a total amount of $101 million
  43. was successfully transferred from BCB’s account
  44. to several accounts in Sri Lanka and the Philippines.
  45. But in the SWIFT operation room in Dhaka,
  46. it was quieter than usual.
  47. The printer was malfunctioning,
  48. so none of the confirmation letters got printed.
  49. They didn’t think much of it,
  50. assuming it was a small mistake,
  51. and were going to fix it the next day.
  52. After spending hours on Saturday getting
  53. the printers to work, the 35 payment requests
  54. caught the BCB employees by surprise, and the SWIFT
  55. communication system was still not working.
  56. Assuming they were mistakes, the BCB employee
  57. tried to contact the New York Fed via email,
  58. phone and fax to cancel the transactions,
  59. but the Fed was shut down for the weekend.
  60. On the following Monday, BCB was able to get
  61. the SWIFT communications system working again.
  62. And it was not until then that they realized
  63. that the most daring bank robbery ever attempted
64. using SWIFT had happened, four days earlier.
  65. It would prove to be the most severe breach
  66. yet of a system designed to be unbreachable.
  67. It turned out that the hackers had installed
  68. malware on BCB’s servers that had sent the
  69. 35 payment instructions and which deleted
  70. any incoming notices of the SWIFT
  71. confirmation messages.
  72. And, when the Fed was back in business that Monday,
  73. BCB was able to reach out and ask them to block the
  74. money transfer – but, it was too late and the money had
  75. already been sent to the recipient banks.
  76. So they sent SWIFT messages to the Philippines bank,
  77. RCBC, but it was a public holiday in the Philippines,
  78. so they would not be read until Tuesday, February 9th.
  79. And by that time, the money had already been
  80. transferred out.
  81. Some funds were transferred to Sri Lanka
  82. and those funds were later recovered because of
  83. a misspelling in a word in the instructions,
  84. which triggered an alert at the local bank.
  85. But the 81 million USD that went to the Philippines
  86. was not recovered.
87. That money was sent to four fake accounts
88. at a small Manila branch of a bank called RCBC.
  89. And from these accounts, the money was taken out
90. and laundered through casinos in the Philippines
  91. – never to be recovered.
92. Now at the time, Philippine casinos were not
  93. covered by anti-money laundering laws,
94. and so it was nearly impossible to track the money.
  95. As of today, most of the money is
  96. still nowhere to be found.
  97. Other similar cyber crimes have been reported
  98. elsewhere, such as in Vietnam and Ecuador,
  99. and other cases will likely come to light
100. – and the hackers have yet to be identified.
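Incidentally, the "sensitive word" that stopped the other 30 orders was reportedly "Jupiter", part of a recipient's street address that happened to match the name of a sanctioned entity. A minimal sketch of that kind of watch-list screening, with invented terms and orders, might look like this:

```python
# Minimal sketch of watch-list screening: flag a payment order if any
# watch-list term appears in its free-text fields. Terms and orders below
# are invented for illustration.
WATCH_LIST = {"jupiter"}

def is_flagged(order: dict) -> bool:
    """True if any watch-list term occurs in the order's text fields."""
    text = " ".join(str(value).lower() for value in order.values())
    return any(term in text for term in WATCH_LIST)

orders = [
    {"beneficiary": "Example Trading Ltd", "address": "Jupiter Street, Makati"},
    {"beneficiary": "Another Company",     "address": "Elsewhere Avenue"},
]
for order in orders:
    status = "held for manual review" if is_flagged(order) else "released"
    print(f"{order['beneficiary']}: {status}")

# Crude substring matching generates false positives, but here a false
# positive is precisely what blocked 30 of the 35 fraudulent orders.
```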

Additional Readings

  • Zetter, K. (2016). That Insane, $81M Bangladesh Bank Heist? Here’s What We Know. Wired Magazine. Retrieved from https://www.wired.com/2016/05/insane-81m-bangladesh-bank-heist-heres-know/ 
  • Hammer, J. (2018). The Billion-Dollar Bank Job. The New York Times. Retrieved from https://www.nytimes.com/interactive/2018/05/03/magazine/money-issue-bangladesh-billion-dollar-bank-heist.html

3.1.2 Case Study – Billion Dollar Bank Heist: Proximity

  1. So if I’m thinking like an old school movie
  2. where you got a cowboy and he goes in
  3. and he’s like robbing a bank, you know,
  4. he can only take away what he can physically carry.
  5. In fact, that’s kind of the component in a lot
  6. of these movies with bank heists and stuff is
  7. like the physical weight of the money is actually
  8. a challenge and they have to like kind of balance
  9. the risk of getting caught, and getting away…
  10. – Speed.
  11. – Speed, all those things, right.
  12. So it’s really like an entire action-packed scenario.
  13. But what you’re saying is like in this type of a scenario,
  14. they can shoot the money all around the world
  15. to different accounts, maybe some of them land,
  16. maybe some of them don’t, and then they can kind of
  17. pull the money out of the successful transactions
  18. and in this particular instance,
  19. unlike a physical bank heist where you might walk away
  20. with a few thousand dollars,
  21. maybe a few million dollars,
  22. most of this was unsuccessful
  23. and they still received $81 million.
  24. – That’s correct.
  25. So if you think about a traditional bank robbery,
  26. you know, you only get one chance at it, really, at a time.
  27. But… – Unless they’re really bad.
  28. – Unless they’re really bad.
  29. But in the cyber heist situation, even in this example
  30. of the Bangladeshi kind of central bank heist,
31. there were multiple, 30, 40-plus, instructions
32. that they kept trying to get through, and so you can think
33. it's almost like spam.
  34. – So in a lot of things that we talk about in this course
  35. when we talk about ethics,
  36. a lot of decisions about whether or not to do something
  37. for a moral reason comes down
  38. to this concept of proximity,
  39. which we talked about earlier in the course.
  40. And in this particular instance, it seems like
  41. let’s say you’re going to walk into a store
  42. and you’re gonna steal a candy bar.
  43. You have to worry about the physical proximity
  44. of walking past the person
  45. and the psychological pressure
  46. of going against kind of these societal rules.
  47. But in this particular instance, you have some guys
  48. from some random country somewhere
  49. that they’re not gonna see the outcome,
  50. they’re not going to understand who they’re harming.
  51. It’s gonna be disparate, right, the opportunity
  52. for catching them is very, very low and as you said,
  53. the cost per infraction is minor.
  54. So once you figure out the code or the script
  55. or whatever necessary to kind of
  56. – You can just keep pinging.
  57. – Just keep doing it over and over and over again
  58. and then if something hits, you could get $81 million.
59. – And to your point, in the situation
60. you explained, because of the geographic distance
61. from the location of the crime
62. to where this person might be,
  63. it removes that kind of connection with humanity
  64. which then makes it easier
  65. to perpetrate certain illicit activities.
  66. – So that then brings up another question.
  67. Cause it’s not just the psychology
  68. of moral decision-making
  69. but it’s also the very practical elements of enforcing
  70. the laws. – Yeah.
  71. – I mean, it must be that it’s incredibly hard as FinTech
  72. gets better and more efficient,
  73. if the criminals get better and more efficient with the
  74. utilisation of FinTech,
  75. it’s gotta be harder for enforcement of these rules.
  76. – That’s right, and that’s absolutely correct.
  77. So one of the large initiatives that many nation states,
78. many governments are trying to rely on now is
79. cybersecurity cooperation between countries,
  80. because the nature of what you described,
  81. it could be criminal activities,
  82. it could be other forms
  83. of access to data that we may want to control,
84. it's very difficult to coordinate information and investigations,
  85. as well as prosecute people potentially doing wrong
86. so even in this Bangladeshi
87. bank heist situation, you know,
88. Bangladesh was involved,
89. the United States was involved,
  90. the Philippines was involved,
  91. Sri Lanka was involved,
  92. and so because of this
  93. kind of transnational aspect of this,
  94. it’s very difficult for particular single nation states
  95. to deal with that alone
  96. so there’s been a lot of movement towards
  97. cyber security partnerships and alliances
  98. amongst countries in order to
  99. try to help manage this problem.

Additional Readings

  • Chernenko, E. (2018). Increasing International Cooperation in Cybersecurity and Adapting Cyber Norms. Council on Foreign Relations. Retrieved from https://www.cfr.org/report/increasing-international-cooperation-cybersecurity-and-adapting-cyber-norms

3.1.3 Case Study – Billion Dollar Bank Heist: Accountability

1. So, again, coming back to this
2. kind of moral decision-making.
  3. The psychology of business ethics or FinTech ethics
  4. largely comes down to the idea of how our actions will
  5. impact those that are around us, right?
  6. And, how our actions may, perhaps,
  7. harm someone down the road.
  8. So, in this particular instance,
  9. and, from a legal perspective, the law is trying
  10. to ensure that those who are in a position to stop
  11. bad things from happening, stop it.
  12. And, then those that are harmed,
  13. they then receive some type of redress for their harm.
  14. Their compensation, or whatever.
  15. So, in this type of challenging cyber security situation,
  16. there’s a really simple question.
  17. Who is actually injured by this?
  18. And, then how do you, then, compensate them,
  19. or help move beyond that?
  20. ‘Cause if you can’t identify,
  21. from a law enforcement perspective,
  22. if you can’t identify who is harmed,
  23. oftentimes it’s gonna be very difficult for a government
  24. to have enough courage,
  25. or to marshal the resources
  26. necessary to really help those people.
  27. So, who is harmed here?
  28. – When you kind of think about
  29. what was the after-effect of this.
  30. You know, one of the things was, Bangladesh ended up
  31. suing this bank in the Philippines,
  32. where this money was transferred to.
  33. Now, ultimately, at the end of the day,
  34. this bank in the Philippines, may or may not have
  35. followed proper procedure.
  36. But, I think it’s very difficult to say
  37. that they were the perpetrators of the actual crime.
  38. And, somehow, they’re being held accountable
  39. for a minor mistake relative to the magnitude
  40. of the actual crime.
  41. And, so, to your point, I think there’s a large amount
  42. of disconnect,
43. because finding the people
44. who are actually responsible,
45. and being able to hold them accountable,
46. is pretty difficult.
  47. – So, then that brings up another related question.
  48. Is the idea…
  49. There are multiple parties along the way
  50. that are touching this transaction.
  51. Right?
  52. So, you have Bangladeshi, maybe regulators.
  53. You have Bangladeshi bank officials.
  54. You have the US Fed,
  55. and those who are touching it on the US side.
  56. You have Filipino banks, regulators, et cetera.
  57. And, really players all around the world.
  58. So, who is in the best position to actually stop this?
  59. And, who should be responsible
  60. for this type of a transaction?
  61. – So, I think there’s a lot of debate around that.
  62. And, to be honest, I don’t know if that’s actually settled.
  63. I think, for crimes committed in particular countries…
  64. So, if we could identify the source country of where
  65. the hacking occurred,
  66. that would of course be a locus of the crime.
  67. So, you would maybe have some prosecution there.
  68. Apparently some of that money was, of course,
  69. sent to the Philippines, and then kind of cleaned,
  70. or laundered, through casinos in the Philippines.
  71. And, so, it seems there may have been some sort of
  72. criminal activity there.
  73. And, of course, for people who use that money,
  74. and put it into the casinos,
  75. they may have been involved in this.
76. But, they may not have; they may have just been engaged
77. without understanding the full magnitude
  78. of where the money came from.
  79. Who knows.
  80. But, then again, there’s a locus of the…
  81. But, each of those crimes,
  82. though they’re part of the larger narrative…
  83. It’s very difficult for a local prosecutor,
  84. say, in the Philippines, or,
  85. wherever else that touches on this, like you were saying,
  86. to connect all the dots.
  87. – Yeah, you wouldn’t even have access
  88. to the information.
  89. If a Philippine official contacted the US Fed,
  90. there’s no way they’re gonna give them information…
  91. Well, it’s unlikely they’re going to give…
  92. – It’d be very difficult.
  93. – Very difficult.
  94. Very difficult.
  95. And, it would require, like national level support.
  96. Okay, so then, in thinking about how we move forward
  97. for these types of things,
  98. especially in terms of considerations of the future…
  99. ‘Cause this is only gonna be easier and easier, right?
  100. – So, let’s move this now to a distributed ledger
  101. type of system with like blockchain,
  102. or other types of cryptocurrencies
  103. that would be involved.
  104. Do you anticipate this type of thing
105. being more or less likely to actually occur?
  106. And, would it have changed the process
  107. of actually securing the funds,
  108. or finding who was responsible?
  109. – So, at a basic level,
  110. the issue is not the integrity
  111. of the blockchain or the ledger itself.
  112. The issue is…
  113. Once those coins have been distributed,
  114. how you hold them or store them.
  115. And, we have a series of examples
  116. of certain types of wallets.
  117. Or, exchanges being hacked where people are able
  118. to access those coins.
  119. Or, sometimes coins being held for ransom
  120. because people…
121. they get hacked
122. and they lose access to them.
  123. And, so that raises another interesting question.

Additional Readings

  • Paul, R. (2018). Bangladesh to Sue Manilla Bank over $81-Million Heist. Reuters. Retrieved from https://www.reuters.com/article/us-cyber-heist-bangladesh/bangladesh-to-sue-manila-bank-over-81-million-heist-idUSKBN1FR1QV
  • Lema, K. (2019). Philippine Court Orders Jail for Former Bank Manager over Bangladesh Central Bank Heist. Reuters. Retrieved from https://www.reuters.com/article/us-cyber-heist-philippines/philippine-court-orders-jail-for-former-bank-manager-over-bangladesh-central-bank-heist-idUSKCN1P40AG

3.1.4 Case Study – Billion Dollar Bank Heist: Cultural Lag

  1. – So, David, what did we learn from this case about
  2. cybersecurity and crimes?
  3. – Well, what I’m learning is that
  4. it’s really complicated.
  5. The cross-border nature of it means
  6. it’s incredibly difficult to enforce.
  7. It’s not just about the money,
  8. but there is reputational damage.
  9. There is embarrassment for governments,
  10. embarrassment for people.
  11. It’s really broad and wide-scale.
  12. There’s very little risk
  13. for the people that are actually
  14. going out and committing these crimes,
  15. and the cost to them is very, very little,
  16. so they can just spam these things out there
  17. and still make a really significant amount of money.
  18. So in this case, they failed
  19. the vast majority of the time,
  20. but they still walked away with $81 million.
  21. In a normal bank heist, that’d be like the biggest
  22. bank heist of all time.
  23. – Sure.
  24. – One thing that I don’t completely understand is
  25. what are some of the failures
  26. that allowed this to happen in the first place?
  27. I mean, how does this even happen?
  28. How do you lose $81 million?
  29. – Yeah, yeah.
  30. So the interesting thing about this is,
  31. when we think about cyber crimes,
  32. so frequently, obviously, there’s
  33. a technological component,
  34. and advances in technology will
  35. kind of create more opportunities or different methods
  36. in which to steal money or, as we’re going to look later,
  37. to steal data,
  38. but,
  39. if we think about this Bangladeshi bank heist situation,
  40. it’s really interesting because it wasn’t just
  41. about the technology.
  42. It was a confluence of different factors.
  43. There was people involved.
  44. There were processes that failed.
  45. There was old equipment
  46. that should’ve been updated that wasn’t,
  47. and so a confluence of these things actually led
  48. to this really significant outcome,
  49. and so I think, in a lot of situations,
  50. if companies, banks, and governments
51. do use a lot of the basics of security:
  52. making sure you have good anti-virus software,
  53. making sure you’re trying to filter out as much
  54. malware and viruses as possible,
  55. making sure you’re using firewalls.
  56. I mean, these are things that weren’t actually happening
  57. in this Bangladeshi bank case situation,
  58. making sure that you’re having updated equipment,
  59. making sure that your people understand
  60. the processes.
61. So, for example, after this happened
62. with the Bangladeshi bank,
63. SWIFT and the New York Fed put out
64. announcements saying,
65. “Hey, be aware of this,” and so,
  66. when subsequent attacks or attempts happen,
  67. like in places like Vietnam and Ecuador,
  68. they were actually able to stop
  69. that from occurring,
  70. even though they were trying to, again,
  71. manipulate SWIFT transmission codes and things.
  72. – So I guess the takeaway is,
  73. on the one hand, it’s kind of amazing
  74. that we’ve gotten to the point as society
  75. where, with the push of a button,
  76. money is flying around the world.
  77. This probably happens millions upon millions of times.
  78. – At least. – Every single day, right?
  79. – At least.
  80. – And so, on the one hand,
  81. although we’re focusing on the negative side of it,
  82. it’s actually pretty amazing that
  83. governments, and even in developing countries,
  84. say, Bangladesh, they’re able to bank
  85. with the New York Fed
  86. or transfer money all over the world,
  87. and most of the time, that works out
  88. for the benefit of everybody, right?
  89. But the flip side is,
  90. we also have to be cognizant
  91. of the challenges and making sure we’re staying ahead
  92. of these things because obviously,
93. one of the big things that
94. we've talked about a lot in this course is that
  95. the law and punishments are often
  96. retroactive and reactionary,
  97. and they’re not really able to
  98. stay ahead of these problems,
  99. so I assume we’re going to keep seeing
  100. some aspects of these things going forward.
  101. – That’s right.
  102. – We’ll always be,
  103. unfortunately, we’ll always be responding
  104. to the last crime or last situation,
  105. and that also maybe means unfortunately,
  106. we may not be able to react effectively
  107. to the last situation either,
  108. as it takes time to put in good policy,
109. as it takes time to get everybody kind of onboard,
  110. to kind of, to the point that you raised earlier
  111. about the difficulty of enforcement,
  112. if these are transnational crimes,
  113. you have to get multiple jurisdictions involved-
  114. – Yeah.
  115. – To kind of police together.
  116. – It actually reminds me of,
  117. have you seen the movie, Catch Me If You Can?
  118. – Yes.
  119. – The Steven Spielberg movie?
  120. – Yeah.
  121. – So, so. – Great movie.
122. – Yeah, Frank Abagnale is
123. considered one of the greatest fraudsters of all time,
124. right?
  125. And so, recently, within the last few years,
  126. he spoke at Google,
  127. and it was a really widely watched
  128. video where he was giving a speech about his life,
  129. and it was so compelling and really interesting,
  130. and one of the questions at the end
  131. that kind of struck me was,
  132. somebody from Google asked him,
  133. “Do you think you could’ve been successful
  134. as a fraudster today,
  135. given all the advancements of security and technology?”
  136. And his response, quite famously, was,
  137. “It’s easier to be a fraudster today
  138. than ever before.”
  139. And he said that he would’ve made
  140. so much more money had he tried to commit fraud
141. in this day and age than at the time.
  142. So I think it’s kind of interesting.
  143. As technology’s advancing,
  144. more good things are happening,
  145. but it also widens the door
  146. for people to abuse the system.
  147. – Exactly.

Additional Readings

  • Segarra, L. M. (2017). The Guy From ‘Catch Me If You Can’ Says It’s Much Easier to Scam People Today. CNN. Retrieved from http://money.com/money/4953472/frank-abagnale-catch-me-if-you-can-scams-easier/ 
  • Villasenor, J. (2016). Ensuring Cybersecurity In Fintech: Key Trends and Solutions. Forbes. Retrieved from https://www.forbes.com/sites/johnvillasenor/2016/08/25/ensuring-cybersecurity-in-fintech-key-trends-and-solutions/#7a65af1f35fd
  • Frank Abagnale: “Catch Me If You Can” (2017). Talks at Google. Retrieved from https://www.youtube.com/watch?v=vsMydMDi3rI [VIDEO]

3.2.1 Case Study – Apple v. FBI

  1. Okay so, we talked a lot about data
  2. and the importance of data,
  3. but who’s responsible for protecting it?
  4. As we consider this question,
  5. let’s think about another situation.
  6. In December 2015, unfortunately, there
  7. was a terrorist attack in San Bernardino, California.
  8. The two attackers were eventually killed and the
  9. authorities recovered an Apple iPhone
  10. from one of the attackers.
  11. The FBI, however, were unable to access the
  12. iPhone because it was encrypted,
  13. which basically means there was a security password
  14. needed to enter the phone.
  15. The problem the FBI faced was that if they
  16. entered the wrong password a certain number of times
  17. the information on the phone would be totally erased.
  18. So the FBI went to Apple and asked them to decrypt the
  19. phone allowing the FBI
  20. to access to the information inside.
  21. From the FBI’s perspective,
  22. they thought that this was important
  23. because the information was necessary
24. for their investigation, and could even prevent a possible
  25. future terrorist attack.
  26. From Apple’s perspective however, they thought that
  27. this could potentially lead to what lawyers call a
  28. “slippery slope” — basically a precedent that
  29. might ultimately lead to greater intrusion and
  30. other privacy issues for its users.
  31. As a result, Apple rejected the FBI’s request
  32. to provide access to the locked iPhone.
  33. Now, this was so important to the FBI that they
  34. actually went to court to try and compel Apple
  35. to provide access to the phone.
  36. After some posturing though, the case was never tried.
  37. And so we don’t really know the exact answer
  38. to the legal question of whether the security risk
  39. was enough to compel Apple to
  40. open the phone.
  41. But this story raises a lot of very interesting questions
  42. that we need to consider.
  43. So, for our students – in this situation,
  44. let me turn the question to you,
45. how do you feel about Apple's decision?
  46. And why do you feel that way?
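The retry-limit behaviour at the centre of this case can be sketched in a few lines. This is of course not Apple's actual implementation; it is just the control flow that made brute-forcing the passcode so risky for the FBI.

```python
# Toy model of the erase-after-too-many-failures behaviour described above.
MAX_ATTEMPTS = 10

class LockedPhone:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False          # once True, the data is gone for good

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True       # the data erases itself
        return False

phone = LockedPhone("4821")
for guess in (f"{i:04d}" for i in range(20)):   # a naive brute-force attempt
    if phone.try_unlock(guess):
        break
print(phone.wiped)  # True: ten wrong guesses and the evidence destroys itself
```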

Additional Readings

  • The Apple-FBI Debate Over Encryption. (2016). NPR. Retrieved from: https://www.npr.org/series/469827708/the-apple-fbi-debate-over-encryption 

3.2.2 Case Study – Apple v. FBI: Trust

  1. – So Dave how about in your situation,
  2. how do you feel about this?
  3. – Well if I understand things correctly.
  4. On the one hand, you have a really challenging scenario
  5. where as a government you’re trying to prevent crime.
  6. Alright, and one of the things
  7. we’ve talked about in this course,
8. is that any type of criminal prevention is largely reactive.
  9. And so as a criminal agency
  10. or a law enforcement agency,
  11. you want to be as proactive as you can and predictive
  12. as you can so that you can stop things from occurring
  13. in the first place right.
14. But the flip side is,
15. it's a scary thought to think that a government
  16. would have the right to access our personal data
  17. within a smartphone at any time simply
  18. because they demand it.
  19. If you think about what is contained
  20. in our smartphones now,
  21. it’s not just the text that we send,
  22. although that’s significant,
  23. it’s not just our images,
  24. although there’s a lot of those.
  25. It’s where you go every single day,
  26. what advertisements you stopped to look at,
  27. what payments you’re making,
28. who's in your social network
29. and who you communicate with.
  30. And so the idea that
  31. the government would demand that
  32. is actually you know, kind of challenging.
  33. – That’s interesting but what about the argument
  34. that people might make,
  35. is by virtue of us using certain applications
  36. on our phones or by using the phone to do banking
  37. or other things that determine our location,
  38. our transaction history,
  39. our social relationships.
  40. There could be people that argue
  41. and companies that actually make this claim,
42. by virtue of the fact that
  43. you’re using our product,
  44. we have access to that data.
  45. So how can we distinguish between that situation
  46. and governments wanting that data too
  47. because it seems like we give a lot of that data
  48. willingly almost to companies,
  49. but why are we not necessarily willing to do that
  50. when it comes to governments?
  51. – It’s a good question,
  52. I think first of all,
  53. I’m not sure I agree that most people give it willingly,
  54. they kind of give it ignorantly.
  55. – There we go, yep.
  56. – And so a lot of more progressive laws
  57. including in the EU, for example,
  58. are now saying that you have to
  59. opt-in to that data sharing,
  60. rather than opting out
  61. because again from a psychological perspective,
  62. we talked about a lot in this class and others,
  63. people are lazy
  64. and we often will agree to things
  65. that we don’t fully understand,
  66. especially when that information is hard to process.
  67. I think that’s definitely true with smartphones.
  68. One of the things with smartphones is that
  69. they took the world by surprise
  70. and a lot of our behaviours evolved within
  71. that ecosystem before we really
  72. understood the consequences of those things.
  73. So we’re seeing with Facebook
  74. and other things,
  75. companies that have in many ways taken advantage
  76. of our ignorance and our laziness
  77. and so now the law once again
  78. is retroactively going back
  79. and restricting that,
  80. and I think that’s relevant
  81. because although it’s important to think that
  82. a company can monetize our data,
  83. and that’s something we should
  84. be talking more about.
  85. It’s the reason why I think most people
  86. would be more concerned
  87. about the government knowing it is
  88. because they have the ability
89. to conscript you even further.
  90. Right, they have the power not only to
  91. give you freedom,
  92. they have the power to take that freedom away.
  93. And so I think for many people,
  94. the idea of government,
  95. any government having free access
  96. to that type of data is kind of Orwellian,
97. a 1984 type of scary amount of data.
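The opt-in versus opt-out distinction raised above ultimately comes down to a default, and defaults decide outcomes for the majority of users who never change them. A minimal sketch, with invented field names (real consent handling under rules like the GDPR involves much more than a flag):

```python
from dataclasses import dataclass

# Invented field names for illustration only.

@dataclass
class ProfileOptOut:
    """Legacy regime: sharing is ON unless the user acts."""
    share_data: bool = True

@dataclass
class ProfileOptIn:
    """Opt-in regime: sharing is OFF until the user actively consents."""
    share_data: bool = False

# Most users never touch the setting, so the default decides the outcome.
print(ProfileOptOut().share_data, ProfileOptIn().share_data)  # True False
```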

Additional Readings

  • Kakutani, M. (2017). Why ‘1984’ Is a 2017 Must Read. The New York Times. Retrieved from https://www.nytimes.com/2017/01/26/books/why-1984-is-a-2017-must-read.html
  • Harrington, M. (2017). Survey: People’s Trust Has Declined in Business, Media, Government, and NGOs. Harvard Business Review. Retrieved from https://hbr.org/2017/01/survey-peoples-trust-has-declined-in-business-media-government-and-ngos

3.2.3 Case Study – Apple v. FBI: Cultural Lag

  1. And are there different examples of countries
  2. over the last few years, all over the world
  3. that have instituted certain types of these
  4. kinds of controls and filters?
  5. And do we feel that those things are necessary?
  6. And do we feel that those are the type of things
  7. that as users,
8. we should, somewhat willingly at least,
  9. hand over to the government?
  10. If not, where do we draw that line?
  11. – Yeah, so it’s tricky.
  12. Like a lot of people would look at certain governments
  13. and they characterise them as authoritarian
14. or very aggressive in terms of their policing of people,
  15. but the reality is,
  16. London is one of the most
  17. surveilled cities in the world.
  18. – They have more CCTV cameras.
19. – Yeah, they have more CCTV cameras, I think, per capita
  20. than any city in the world.
  21. And this is true in New York, DC.
  22. I lived in DC.
  23. There’s cameras everywhere.
  24. And so it’s true to a certain extent that we’ve already
  25. given up so many of these concepts of freedom
  26. and so much private information
  27. that we don’t even realise,
  28. I think to a certain extent what the effects
  29. of that will be.
  30. And so, yeah, I think as society we definitely
  31. need to take into account the freedoms
  32. we have already given away.
  33. But now as we’re looking at these things retroactively,
  34. it’s not just about us as the data providers
  35. or the government as the eventual user,
  36. but there’s these companies in between.
  37. And, and I think the question of FinTech,
  38. it is what is the moral obligation of those companies
  39. in the middle in terms of
  40. protecting that data in terms of,
  41. and I think it comes back to a fundamental question,
  42. do we own the right to our own private data?
  43. Right, I think that’s why the distributed ledger
  44. and blockchain I think are so appealing to many people
  45. is the idea that private data is one of the biggest
  46. and most important commodities
  47. in the world right now.
  48. And yet we individuals who the data is about,
49. we have no control over who uses it, who sees it,
  50. who sells it, et cetera.
  51. And so I think,
  52. these are the types of things we have
  53. to figure out as society.
  54. – And I think those arguments
55. and debates that you rightly described,
56. that as societies we need to figure out,
57. I think apply not just to FinTech,
58. which of course they do,
59. but also to other advances
60. we are making in technology,
  61. particularly in biotechnology.
  62. So if we think about the commercialization of
  63. DNA testing
  64. and as you provide different samples to companies
  65. so they can check your family’s history
  66. or health markers in your DNA.
  67. There’s a lot of debate and questions about,
  68. hey, once you hand that over to these companies
  69. who actually owns that DNA at that point?
  70. Because that is so uniquely yours; yet
  71. can you hand that over to somebody else?
  72. And again, that's a very similar situation
  73. to what you described before, about how
  74. we sign up for apps ignorantly,
  75. not understanding
  76. what rights we're giving up.
  77. Similarly, with these kinds of DNA tests
  78. and other types of
  79. commercial biotechnology
  80. projects that are going on,
  81. there's a lot of ignorance around,
  82. hey, what are you actually giving up to these
  83. companies?
  84. – Well, that’s a great tie and actually to this Apple case,
  85. because in the state of California, right?
  86. There’s that great case where the guy,
  87. they had this unsolved serial killer case
  88. and the police for decades
  89. didn’t know who this guy was
  90. and then he takes one of these tests,
  91. not a blood test, but a DNA test,
  92. sends it in, not realising that the data
  93. produced wasn't going to be private;
  94. it ended up on a public database,
  95. and that data from his DNA was actually used
  96. to find him and capture him as a serial killer.
  97. So many people in society were debating this issue.
  98. On the one hand, they were super excited
  99. that you have this, like...
  100. – Murderer off the street
  101. – Murderer, yeah, he’s off the streets, right?
  102. And he’s been caught.
  103. But the flip side is they're like, wait, hold on,
  104. how did they get his information?
  105. How did they know it was him?
  106. Because he put this information up and didn’t realise
  107. that it could lead to exactly identifying him as the killer.
  108. And so I think this is
  109. the dichotomy that we face now:
  110. the utilisation of smartphone technology
  111. and all these technologies
  112. has opened up so many avenues in life
  113. that we've never had before, in communication
  114. and financial transactions and
  115. data and knowledge
  116. and so many cool things.
  117. But we are simultaneously
  118. ourselves becoming the product.
  119. And I don’t think we really understand
  120. the repercussions of that yet.

Additional Readings

  • DNA From Genealogy Site Used to Catch Suspected Golden State Killer. (2018). CBC News. Retrieved from https://www.cbc.ca/news/world/dna-from-genealogy-site-used-to-catch-suspected-golden-state-killer-1.4637726
  • Kerry, C. F. (2018). Why Protecting Privacy is a Losing Game Today – And How To Change The Game. The Brookings Institution. Retrieved from https://www.brookings.edu/research/why-protecting-privacy-is-a-losing-game-today-and-how-to-change-the-game/
  • Hill, S. (2019). Should Big Tech Own Our Personal Data? Wired Magazine. Retrieved from https://www.wired.com/story/should-big-tech-own-our-personal-data/
  • Heller, N. (2018). We May Own Our Data, But Facebook Has a Duty To Protect It. The New Yorker. Retrieved from https://www.newyorker.com/tech/annals-of-technology/we-may-own-our-data-but-facebook-has-a-duty-to-protect-it

3.2.4 Case Study – Apple v. FBI: Accountability

  1. So if we go back to your original question
  2. about whose responsibility then, is it to protect data,
  3. where do you fall on that?
  4. – So importantly, as a consumer,
  5. I’m increasingly realising that,
  6. number one, it has to start with me.
  7. As a parent, I’m realising that I’m trying
  8. to do a better job of educating my children
  9. about privacy and data than my parents did,
  10. not because my parents are bad,
  11. but they didn’t have to face these issues.
  12. – Challenges.
  13. – Yeah, and actually studies have shown,
  14. when they've looked at morality and decision-making
  15. in psychology,
  16. that young people today are very similar
  17. in their stances on moral decision-making
  18. in almost every regard, except for one.
  19. And the one big difference today, versus say,
  20. one or two generations ago,
  21. is the perception of privacy.
  22. And young people today do not have this same standard
  23. or high regard for privacy,
  24. because they’ve grown up on a stage.
  25. It’s a public stage, right?
  26. Every day it’s Instagrammable
  27. and if you didn’t click it, it didn’t happen
  28. and so we’ve given up so much of our own privacy
  29. that it’s no longer even perceived as a moral issue
  30. anymore,
  31. because there is no other option, in their mind.
  32. So I think it definitely starts with the consumer.
  33. So I’m gonna flip this around on you though,
  34. because from an Apple perspective,
  35. Apple’s business model, now, publicly,
  36. from a marketing perspective,
  37. is hey, buy our products
  38. because we don’t sell your data, right?
  39. And this is in part true, because a lot
  40. of their revenue model is based on the hardware
  41. that they produce, right?
  42. So do you think that they actually care about data
  43. and they’re using this as kind of a moral high ground,
  44. or is it just that they know that they’re making most
  45. of their revenue off the hardware anyway
  46. and so this is just kind of a marketing ploy?
  47. – Yeah, that’s an interesting question and to be honest,
  48. I don’t think those two things are mutually exclusive
  49. and certainly, for Apple executives,
  50. I’m sure they feel like they do perhaps have
  51. a moral high ground,
  52. because of this experience that they had
  53. and because they currently,
  54. their business model doesn’t require
  55. them to monetize their data they have of users
  56. to send out to external sources
  57. by the nature of their business and the ecosystem
  58. in which users operate,
  59. since it’s almost all within Apple.
  60. – Yeah, it’s enclosed, yeah.
  61. – It’s enclosed and so at a certain point,
  62. if that business model changes, will their ability
  63. to take that moral high ground change?
  64. Perhaps.
  65. The profit incentive can be really powerful
  66. for listed companies.
  67. – See, this is where I struggle with this
  68. and so my history, as you know,
  69. is I used to be Apple’s outside counsel, within Asia.
  70. They had a direct phone line to me
  71. when I was still working for my law firm
  72. and I had a lot of close interaction
  73. with various people within Apple
  74. at the time of the iPhone and the iPod Touch.
  75. These things were first being introduced into Asia
  76. and one of the things that
  77. immediately became apparent
  78. is that the profit margin was paramount
  79. and the profit margin on the hardware, at the time,
  80. was well over 60%, right?
  81. Just imagine, that’s like cosmetics type of profit margin.
  82. And the interesting thing about Steve Jobs
  83. and the business model that he made
  84. is that it was fully self-contained
  85. because he wanted people in an ecosystem
  86. so they essentially, would have to…
  87. – just use Apple products.
  88. – Yeah, because you have to give up a lot
  89. to leave that ecosystem, right?
  90. So whether it’s the earbuds, or all these things,
  91. they can only work within the Apple ecosystem
  92. and so now, interestingly enough, you now have,
  93. say the Android model, which is the exact opposite,
  94. where it’s like, flood the market with these things,
  95. as broadly as you can, open the software…
  96. – And they don’t care who the hardware are from.
  97. – They don’t care who it is, because their revenue model
  98. is based off of the selling of the data
  99. and so I’m not so sure that Apple is altruistic
  100. or moral in this way.
  101. I think if they had built a model that could
  102. generate revenue in a way that’s similar to Google,
  103. while simultaneously maintaining the profit margins
  104. on their hardware,
  105. I think, I mean they’ve proven that profit is paramount.
  106. – And so again, kind of to what I was saying, I think,
  107. when push comes to shove,
  108. particularly for listed companies,
  109. publicly listed, traded companies,
  110. that profit motivation frequently overwhelms
  111. any kind of moral principles
  112. that CEOs and companies may espouse,
  113. which is unfortunate.
  114. Now, we do see certain leaders, now more and more,
  115. taking on more of an activist approach,
  116. beyond just their business, into certain aspects
  117. of political activism and having a voice
  118. when it comes to certain moral issues,
  119. which, I think, perhaps is a good thing
  120. and I think it’s necessary for such leaders
  121. to contribute that voice to these kind of debates
  122. in order for us, as users,
  123. them as the producers of these products,
  124. as well as for governments,
  125. to collectively try to think about
  126. how we can manage these issues around data
  127. and protecting data.

Additional Readings

  • Kezer, M., Sevi, B., Cemalcilar, Z., & Baruh, L. (2016). Age Differences in Privacy Attitudes, Literacy and Privacy Management on Facebook. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 10(1). Retrieved from https://cyberpsychology.eu/article/view/6182/5912

3.2.5 Case Study – Apple v. FBI: Privacy

  1. So we kind of flew by the fact that
  2. the US government asked Apple
  3. to build a backdoor in the first place.
  4. Right.
  5. And so I think many of us maybe just assumed
  6. that they can build
  7. a backdoor into these things,
  8. and that if they wanted to,
  9. they could access this data essentially
  10. whenever they want, you know,
  11. assuming the agreements would allow it.
  12. But what about the very idea of a technology company
  13. having that ability in the first place?
  14. There are huge markets globally,
  15. in terms of you know, smartphone technology,
  16. and other types of FinTech data,
  17. where people maybe don’t even realise,
  18. that these back doors do exist already.
  19. And in some cases,
  20. it’s almost a free flow of information,
  21. from the user to the company,
  22. and then eventually to the government.
  23. So is that even moral or ethical in the first place?
  24. – If we just kind of extrapolate from Apple
  25. to other kinds of technologies,
  26. either apps,
  27. hardware,
  28. you know, wallets that may hold cryptocurrencies,
  29. almost all of these have some
  30. sort of protection:
  31. a password, a key, whatever it may be.
  32. And some of those have
  33. what you'd call a backdoor,
  34. and some of them don't.
  35. So frequently, with a lot of wallets,
  36. once you lose that password, your key to get in,
  37. that's it: you lose it, and the coins,
  38. or whatever value you had in there, are now gone (see the short sketch after this transcript).
  39. So more broadly, do we,
  40. or do governments, then have a responsibility
  41. to police that?
  42. – Right.
  43. – That’s an interesting question.
  44. I think if we tie this back into the Apple case,
  45. the question then is,
  46. at what point does the need for
  47. the government to access certain information,
  48. – Security, safety.
  49. – rise to the level where the company
  50. is compelled to comply, and privacy must yield?
  51. Now we have lots of history,
  52. in a lot of countries,
  53. where governments frequently go to somebody and say, hey,
  54. I know you have this kind of evidence,
  55. you wrote this paper,
  56. say the Smithsonian paper, give us that evidence,
  57. or you videotaped something,
  58. give us that evidence.
  59. Right, so, you know,
  60. this is not uncommon
  61. in a lot of types of criminal investigations.
  62. So there is that analogy that can be made to that.
  63. But, you know,
  64. the aspect of what we're talking about is this data,
  65. some of which is still private
  66. and may not necessarily always be directly tied
  67. to a compelling government interest;
  68. should the government automatically
  69. have some way to access that information?
  70. – Yeah,
  71. I think that’s questionable.
  72. But I think if we look through history,
  73. there have been situations where governments,
  74. do demand that of companies.
  75. – Yeah. And it’s, I mean,
  76. this is like the new form of national security argument,
  77. right? So before national security,
  78. was guns and bombs etc.
  79. Now, it’s knowing where people are,
  80. and it’s mass behaviour modification.
  81. So the bike sharing apps,
  82. I think, are good examples of this.
  83. Where you know, it’s not just about money,
  84. going between a customer and a vendor,
  85. is the idea of understanding where
  86. people are at all times.
  87. So my last question for you is,
  88. when you see a case like this,
  89. do you have an iPhone?
  90. – I do.
  91. – You have an iPhone, okay.
  92. So when you have a case like this,
  93. – And I am wholly ingrained
  94. within the Apple ecosystem.
  95. – You're in the Apple ecosystem,
  96. so good, they're not selling your data.
  97. Do cases like this
  98. make you question carrying a smartphone?
  99. Not that you’re going to commit any,
  100. – That’s right.
  101. – wrong acts or anything, but,
  102. I mean, did you ever, like, stop
  103. and think, for your kids,
  104. you’ve got two kids, right?
  105. Did you ever stop and say like,
  106. do I want a smartphone in my kids’ hands?
  107. – Yeah, so when you ask that question,
  108. that's immediately where my mind went.
  109. And I think for a lot of young people,
  110. some of them may not be,
  111. mature enough in certain ways to understand,
  112. kind of the issues that surround some of these things.
  113. So, like you were saying about that study,
  114. about how the younger generation views privacy
  115. compared to, you know, the generation before.
  116. I think it’s important that we educate our children,
  117. our young adults about the impacts,
  118. of this kind of the use of smartphones,
  119. the use of particular applications on smartphones,
  120. and technology in general.
  121. I think that kind of education,
  122. will make other aspects of forming good policy better.
  123. I think we know from different things,
  124. that we’ve touched on in the course,
  125. that informed consumers,
  126. will generally make better decisions.
  127. And if they make better decisions,
  128. then we can more fully utilise the positive aspects
  129. of these new financial technologies,
  130. as opposed to being used by them.
  131. – Would you force your kids
  132. to give you the password to their phone,
  133. do you think?
  134. – You know, fortunately, they’re young enough.
  135. – But just in the future, do you think?
  136. – I don’t know.
  137. – Because, here,
  138. this is a microcosm of a broader question.
  139. So when we're talking about our children,
  140. my children don't have smartphones
  141. that they carry around,
  142. but I would want to know
  143. what they're doing, and I would see
  144. that as my responsibility as a parent to keep them safe.
  145. And if you take that, extrapolate it out
  146. to the government,
  147. that’s exactly their point.
  148. Right? So I like your answer,
  149. because you’re saying it’s all about educating,
  150. and then letting people make informed decisions.
  151. But I think when you are in the position of authority,
  152. and you’re trying to protect people,
  153. oftentimes that kind of
  154. desire to protect maybe overcomes that, yeah.
  155. – And so my hope would be, and I honestly don't know,
  156. my hope would be that my efforts to educate
  157. are effective, such that
  158. I can hopefully trust them enough
  159. that they could use this technology initially.
  160. And if there's potentially an issue,
  161. then we may have another discussion about its use.
  162. – And then we’ll hack their phone.
  163. – That’s right.
  164. And then we’ll ask Apple to hack the phone.
  165. But I think a nice way to wrap this up,
  166. and to kind of get to the complexity of this issue,
  167. there was a quote by a guy named
  168. General Michael Hayden,
  169. who was a former director
  170. of the United States National Security Agency,
  171. as well as the Central Intelligence Agency,
  172. with respect to this whole situation with Apple
  173. in the San Bernardino terrorist case.
  174. He commented in a report that
  175. this may be a case where
  176. we have got to give up some things in law enforcement,
  177. and even counter-terrorism,
  178. in order to preserve
  179. this aspect of our cybersecurity.
  180. And so I think he captured that well,
  181. in the sense that there's a balance of national interest,
  182. but there's also a competing interest
  183. in how we want to secure data,
  184. this cybersecurity,
  185. and this is an ongoing debate
  186. that I think we will continue to have; hopefully,
  187. with our students thinking through this course
  188. and these questions, they can also contribute
  189. to the debate wherever they may be.
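
A quick illustrative note on that point: many wallets have no backdoor by design, because the only decryption key is derived from the owner's passphrase. Below is a minimal, toy sketch in Python (standard library only); the passphrase, the salt handling, and the XOR keystream are illustrative assumptions, not real wallet cryptography. The point is simply that with no second key, there is nothing for a company or a government to hand over, and a lost passphrase really does mean lost funds.

    # Toy sketch (not production crypto): wallet contents encrypted under a
    # key derived from a passphrase. There is no second ("backdoor") key,
    # so losing the passphrase means losing whatever the wallet holds.
    import hashlib
    import os

    def derive_key(passphrase: str, salt: bytes) -> bytes:
        # Slow key derivation makes brute-forcing the passphrase expensive.
        return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

    def xor_stream(key: bytes, data: bytes) -> bytes:
        # Keystream built by hashing key + counter; XOR is symmetric, so the
        # same function both encrypts and decrypts.
        out = bytearray()
        counter = 0
        while len(out) < len(data):
            out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
            counter += 1
        return bytes(a ^ b for a, b in zip(data, out))

    salt = os.urandom(16)
    secret = b"private key controlling the coins"  # hypothetical wallet secret
    locked = xor_stream(derive_key("correct horse battery staple", salt), secret)

    # Only the exact passphrase recovers the plaintext...
    assert xor_stream(derive_key("correct horse battery staple", salt), locked) == secret
    # ...a wrong guess yields noise, and there is no master key to fall back on.
    assert xor_stream(derive_key("password123", salt), locked) != secret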

Additional Readings

  • Limitone, J. (2016). Fmr. NSA, CIA Chief Hayden Sides with Apple Over Feds. Fox Business. Retrieved from https://www.foxbusiness.com/features/fmr-nsa-cia-chief-hayden-sides-with-apple-over-feds

3.3.1 Case Study – The Sony Hack

  1. In an age where data is supposedly the new oil,
  2. FinTech companies have raised serious concerns
  3. about data protection and compliance,
  4. especially in light of the recent spate of
  5. global cyberattacks, as the presence of valuable
  6. personal information makes FinTech companies
  7. increasingly attractive targets for cybercriminals.
  8. Okay so, let’s dive into another story.
  9. On Monday, November 24, 2014,
  10. a typical week begins at the
  11. Sony Pictures Entertainment’s headquarters in
  12. Culver City, California – right next to Los Angeles.
  13. As employees begin arriving at work, they realize that this is
  14. far from an ordinary work day.
  15. The image of a skull flashes on every employee’s
  16. computer screen, accompanied by a threatening
  17. message warning that “this is just the beginning”.
  18. The hackers, calling themselves the Guardians of Peace,
  19. go on to say that they have obtained
  20. “all of Sony’s internal data”,
  21. and if demands are not met,
  22. they will release Sony’s secrets.
  23. And because of the hack,
  24. the whole Sony network was down, rendering the
  25. Sony employees’ computers completely inoperable.
  26. The hack had brought the global corporation
  27. to an electronic standstill.
  28. On November 27, the hackers leaked
  29. five upcoming Sony films online.
  30. This is the first of what were to become many
  31. subsequent leaks in the days and weeks to follow.
  32. Speculation began arising that North Korea
  33. may be responsible for the attack, in retaliation
  34. for the movie The Interview, which depicts
  35. an attempted assassination of North Korea’s leader,
  36. Kim Jong Un.
  37. Back in June, when the trailer was first released,
  38. North Korea had called the movie an “act of war”,
  39. saying that it would carry out strong and merciless
  40. countermeasures.
  41. About a week later, the FBI officially began
  42. an investigation, and Sony hired a cyber-security firm to
  43. carry out an investigation of the attack.
  44. In the following days more leaks are published online,
  45. including the salaries of top-paid executives
  46. and more than 6,000 employee names,
  47. job titles, home addresses, salaries and bonus details.
  48. And reports also arose that Sony was fighting back,
  49. using hundreds of computers in Asia to execute a
  50. “denial of service”, a so-called DDoS attack,
  51. on sites where its stolen data
  52. was being made available.
  53. On December 7, C-Span reported that the hackers had
  54. stolen 47,000 unique Social Security numbers
  55. from the Sony computer network.
  56. With this data being leaked on the internet,
  57. other cyber criminals instantly swooped in
  58. – leading to various fraud, theft and other
  59. problems for Sony’s employees.
  60. On the same day, North Korea denied all involvement
  61. – but called it a “righteous deed of the supporters
  62. and sympathizers of the country”.
  63. And beyond just coping with the cyberattack
  64. and the various leaks, Sony was also challenged
  65. on other fronts, such as
  66. by former employees
  67. filing class-action lawsuits against the company
  68. which they argued had taken inadequate safeguards
  69. to protect personal data.
  70. And Sony also faced battles with the media,
  71. demanding that the media stop reporting on the
  72. stolen data and claiming that journalists were actually
  73. abetting criminals in disseminating
  74. the stolen information.
  75. On December 16, Sony hackers threatened a
  76. 9/11-style attack on theatres that showed
  77. “The Interview”, which led to theatres
  78. across the United States cancelling their premieres
  79. and Sony pulling all TV advertising for the movie.
  80. Urged by President Barack Obama
  81. not to give in to the hackers’ demands,
  82. Sony instead jumped directly to a digital release.
  83. On December 19, the FBI officially implicated
  84. North Korea in the Sony hack.
  85. North Korea proclaimed its innocence
  86. and in the following days,
  87. heated rhetoric emerged from both countries.
  88. Now, other security experts had some doubts
  89. about whether North Korea was actually involved
  90. in the hack.
  91. Another theory points the finger at angry former
  92. employees, whereas others say it was the work
  93. of outside hacking groups that simply used
  94. the release of The Interview as cover for their actions.
  95. Now, the challenge that we have is that
  96. the Sony hack was not a single anomaly,
  97. as we are witnessing a huge surge in data
  98. breaches across the world.
  99. Now just to give you a few examples:
  100. In 2013, 40 million credit and debit card records were
  101. stolen from Target.
  102. And, just before the Sony hack,
  103. 56 million credit card numbers from
  104. Home Depot customers were also breached.
  105. In 2017, some of the biggest companies in America
  106. were also hacked, such as Yahoo, Uber and Equifax.
  107. In the case of Equifax, the hack compromised the data
  108. of around 143 million Americans;
  109. that’s about half of the US population
  110. and well over half of the adult population
  111. (see the quick check after this transcript). And the hackers
  112. had gained access to over 200,000 credit card numbers.
  113. And in 2018, we know that Marriot had a data breach
  114. affecting 500 million guests.
  115. So, with all these massive data breaches globally,
  116. important questions naturally arise around our
  117. key principles of trust, proximity, accountability,
  118. cultural lag and privacy.
  119. Like, who owns your data
  120. – and who is protecting it?
  121. Can you trust them?
  122. How may data protection be regulated?
  123. With recent technological advancements,
  124. are we able to protect our own data and privacy?
  125. We’ll discuss these questions
  126. further with you in our next session.
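
A quick back-of-the-envelope check of the Equifax proportions quoted above. The population figures here are editorial assumptions (roughly 325 million US residents and roughly 250 million adults around 2017), not numbers from the transcript:

    # Sanity-check of the Equifax proportions quoted in the transcript.
    affected = 143_000_000
    us_population = 325_000_000  # approximate 2017 figure (assumption)
    us_adults = 250_000_000      # approximate 2017 figure (assumption)

    print(f"share of total population: {affected / us_population:.0%}")  # ~44%, "about half"
    print(f"share of adult population: {affected / us_adults:.0%}")      # ~57%, "well over half"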

Additional Readings

  • Dawson, F. (2015). What the Sony Hack Can Teach About Cyber Security. Forbes. Retrieved from https://www.forbes.com/sites/freddiedawson/2015/02/27/what-the-sony-hack-can-teach-about-cyber-security/#773860fa18a0
  • Leskin, P. (2018). The 21 Scariest Data Breaches of 2018. Business Insider. Retrieved from https://www.businessinsider.com/data-hacks-breaches-biggest-of-2018-2018-12 

3.3.2 Case Study – The Sony Hack: Trust

  1. Okay, so I mean it’s interesting
  2. and that allowed people to maybe
  3. view Spider-Man a little bit early,
  4. but what is the connection between this and FinTech?
  5. This isn’t really like a FinTech case.
  6. – So that’s a great question, I think this case,
  7. the Sony hack leads into kind of broader questions
  8. about data and security, and I think those are things
  9. that we wanna talk about on the context of this module.
  10. But I think one, in terms of FinTech,
  11. we like to think if something has ‘crypto’ in front of it,
  12. somehow it’s maybe more secure
  13. than other forms of finance or data
  14. or other spheres of finance that we may be involved in.
  15. – It's like we don't understand it,
  16. so we assume other people don't either.
  17. – Perhaps, perhaps, right. – Yeah, yeah.
  18. – We have cryptocurrency, Bitcoin being probably
  19. the most representative of cryptocurrencies
  20. at the moment.
  21. And even then we know that participants related to
  22. the cryptocurrency market have been hacked, right?
  23. So probably the foremost example of that is
  24. there is an exchange called Mt. Gox
  25. that was based in Japan;
  26. at the time, it handled most of the world's
  27. cryptocurrency transactions.
  28. They ended up being hacked,
  29. losing Bitcoin valued at billions of dollars,
  30. and eventually they went bankrupt.
  31. And so that's a very direct example
  32. of how cybersecurity issues are still very relevant,
  33. even to things in FinTech that we think may be secure.
  34. Right, so that’s the first point.
  35. I think the second point is a little bit broader,
  36. in the sense that as FinTech
  37. and different types of applications of it
  38. become much more widespread,
  39. populations that maybe didn't have access
  40. to traditional forms of finance
  41. now do, not by going to a
  42. brick-and-mortar bank
  43. but by accessing banking services through their phones,
  44. right?
  45. – Yeah, yeah.
  46. – You would assume that a lot of these populations
  47. maybe are not as technologically sophisticated.
  48. And so as they get exposed to these new technologies,
  49. their concept of cybersecurity
  50. and how to protect their data will become an issue too,
  51. and they can potentially be a population at risk
  52. in terms of hacking and cybersecurity.
  53. So this is why this is a very important topic
  54. that goes hand-in-hand or in parallel
  55. to advances in technology and FinTech.
  56. – So one of the things that I think often comes up is
  57. when these types of things happen,
  58. it’s who owns the data,
  59. but also who’s responsible for this.
  60. So in the Sony case, what happened to Sony?
  61. Did they get in trouble for this at all?
  62. Was there any liability on their part?
  63. – Well there’s no criminal liability that we know of,
  64. but we know from a civil liability standpoint
  65. meaning somebody filing lawsuits
  66. that there were a number of lawsuits
  67. against Sony saying,
  68. hey, you should have been more responsible
  69. for how you protected that data than you were.
  70. – So is this customers, employees,
  71. shareholders, all of the above?
  72. – So I think, to my understanding, the majority of the cases
  73. that were brought against Sony
  74. were generally from former employees,
  75. who probably had much more data with Sony
  76. because they had employee records
  77. and different personal information
  78. that ended up getting exposed.
  79. But if you think about it,
  80. that information is about you individually,
  81. or individual people,
  82. but it’s being held by somebody else,
  83. so who actually owns that data?
  84. – Right.
  85. – Does Sony own that data?
  86. Do you own that data if it’s about you?
  87. Because that idea of ownership
  88. then links into the idea of responsibility,
  89. which then links into the idea of protection.
  90. And then understanding that gives us
  91. a more comprehensive approach to
  92. trying to figure out who actually has
  93. a responsibility to protect all of this data.
  94. – And it seems like,
  95. either from a regulatory standpoint or
  96. certainly from an ethics standpoint,
  97. we haven't really answered those questions yet, right?
  98. – Really quick, on the ownership and liability,
  99. it's very interesting that in many situations,
  100. with certain kinds of social media,
  101. social networking services,
  102. users will post different types of personal data,
  103. be it pictures, be it stories, be it videos,
  104. and frequently these social media services,
  105. these social networking services,
  106. will actually say they don't own the data.
  107. – Yeah, yeah.
  108. – But they will say that they are licensing the data-
  109. – They don’t want the responsibility of ownership.
  110. -and then that creates all kinds of questions,
  111. well if we own the data now,
  112. then do we have to pay you for that data?
  113. But, so frequently, the way they
  114. navigate this somewhat thin line
  115. is: you still own your data, but you've
  116. licensed it to us by virtue of using our platform.
  117. And then in that situation,
  118. they can use it as if they owned it,
  119. but maybe they don't have
  120. the same responsibilities for protecting it.
  121. And so this again raises a number of questions.

Additional Readings

  • Ellis, R. (2014). Lawsuits Say Sony Pictures Should Have Expected Security Breach. CNN. Retrieved from http://www.cnn.com/2014/12/20/us/sony-pictures-lawsuits/
  • Raymond, N. (2015). Sony to Pay Up to $8 Million in “Interview” hacking Lawsuit. Reuters. Retrieved from https://www.reuters.com/article/us-sony-cyberattack-lawsuit/sony-to-pay-up-to-8-million-in-interview-hacking-lawsuit-idUSKCN0SE2JI20151020
  • Pham, S. (2019). Former Mt. Gox chief Mark Karpeles acquitted of most charges in major bitcoin case. CNN Business. Retrieved from https://edition.cnn.com/2019/03/14/tech/mark-karpeles-mt-gox/index.html

3.3.3 Case Study – The Sony Hack: Accountability

  1. So one of the challenging things
  2. with this type of a data breach
  3. is it relates to time, right?
  4. It’s often very difficult for the parties involved
  5. to know when they were hacked
  6. and then after the fact,
  7. it often takes time for them to
  8. then react or even publicly,
  9. you know, tell people that the hack occurred, right?
  10. So how does that like impact these scenarios?
  11. – So time is a really interesting variable
  12. when it comes to these cybersecurity matters.
  13. So like, as you mentioned, frequently companies
  14. don't know, or only know later, that they've been hacked.
  15. So then at that point,
  16. if something happened many years ago–
  17. – So it’s not like in TV, where it’s like,
  18. I’ve been hacked and all the lights are gone.
  19. – That’s right.
  20. – Everyone’s typing on the same keyboard
  21. at the same time.
  22. – Well, I think maybe in certain situations
  23. that could happen, I don’t know,
  24. but I imagine in a lot of situations,
  25. you know, a company has been hacked
  26. or data has been exposed,
  27. either intentionally or unintentionally,
  28. and they might not know about it
  29. for a prolonged period of time,
  30. and I think we’ve seen a lot of examples of that
  31. even the Mt. Gox situation that we talked about
  32. a few minutes ago is a situation like that,
  33. where the hack might have happened years before,
  34. and so, there’s a lot of
  35. uncertainty around this time element,
  36. particularly when did it happen?
  37. But then, on the back end, let's assume the company
  38. has found out, a day later, the same day,
  39. whenever it is; then how do they react?
  40. I mean, it seems from some studies
  41. that on average, companies take at least six months
  42. to react and kind of figure out
  43. what their next step is.
  44. One challenge that companies have is that
  45. each of these companies that goes through this
  46. has very different capabilities.
  47. So certain companies may have
  48. really good management processes, good leadership,
  49. good operational control, and, you know,
  50. teams that can respond when
  51. some sort of emergency situation comes up,
  52. and they have some sort of protocol
  53. that they go through.
  54. But there are a lot of companies,
  55. the reality is most companies are probably
  56. not really well managed,
  57. and maybe don't know what to do when that happens,
  58. and then you have very interesting incentives,
  59. particularly for listed companies.
  60. Companies that are publicly traded
  61. that have this kind of issue have, you know,
  62. debates that occur probably at the highest levels,
  63. both in the boardroom as well as at
  64. the chief executive officer level,
  65. about when they should reveal certain information:
  66. should it be before a certain deadline
  67. in terms of quarterly cut-offs and things like that,
  68. because maybe they don't want to impact the share price,
  69. – Or their job.
  70. – and so there's a lot of incentives
  71. or disincentives that go into wanting to publicise
  72. or not publicise the information as well,
  73. and so this is a great challenge that we have,
  74. and then it goes back to this again,
  75. going back to the idea of who’s in charge
  76. of protecting this data then, right?
  77. Because, as we've discussed,
  78. if, by whatever method, the company has
  79. aggregated or compiled this data,
  80. then do they have responsibility
  81. or stewardship over it, right?
  82. You know, I think from an ethical perspective,
  83. we would say, yeah, right,
  84. if something has been left in your care,
  85. then you would assume that there’d be
  86. some level of responsibility to protect
  87. what has been left in your care.
  88. Now, it seems that that’s not always the case
  89. from the behaviour of business leaders.
  90. – Yeah, and one of the things I mean,
  91. let’s assume that North Korea was involved,
  92. let’s just say, and it’s not clear that they were,
  93. this is one of those cases that when it happened,
  94. you know, it kind of brought home to me this idea that
  95. personal data,
  96. is in many ways as important to national security
  97. as a border might be,
  98. and I had never really thought about that before.
  99. So what are kind of security implications
  100. from a data standpoint?
  101. – Yeah, so if we take a step back, you know,
  102. there are a lot of people who feel data will be the fuel
  103. of not just FinTech, but perhaps of
  104. the Fourth Industrial Revolution.
  105. So you know, people talk about
  106. all the technological advances–
  107. – AI and the like–
  108. – That’s right, all of this will be empowered
  109. or further enhanced by large amounts of data,
  110. and so at the core, you know, companies,
  111. large technology companies in the United States
  112. and in China
  113. and other places in the world, you know, a lot of them
  114. are branching out and building very large platforms,
  115. where users are participating on the platform
  116. through various different services
  117. that these companies offer,
  118. but at the heart of all of that
  119. is that these companies now have the opportunity
  120. to get a fuller, more comprehensive view of usage
  121. and of data,
  122. and richer data that can be used to
  123. develop new products, as well as to
  124. develop profiles of people.
  125. Now, we already know that in China, for example,
  126. at the government level, they're trying to develop
  127. social credit;
  128. so, to tie into your question,
  129. that has very direct implications for
  130. how that credit or that data may then be used.
  131. – National policy.
  132. – That’s right.
  133. – There are examples of this where government officials
  134. have said they’ll use this in determining visa rights–
  135. – Who can leave the country
  136. and who can’t leave the country–
  137. – What jobs you can get, what you can study in university
  138. whether you can be a journalist, a lawyer, et cetera.

Additional Readings

  • Cost of a Data Breach Study. (2018). IBM. Retrieved from https://www.ibm.com/security/data-breach
  • Osborne, C. (2015). Most Companies Take over Six Months to Detect Data Breaches. ZDNet. Retrieved from https://www.zdnet.com/article/businesses-take-over-six-months-to-detect-data-breaches/
  • Zetter, K. (2014). Evidence of North Korea Hack is Thin. Wired Magazine. Retrieved from https://www.wired.com/2014/12/evidence-of-north-korea-hack-is-thin/
  • Marr, B. (2019). Chinese Social Credit Score: Utopian Big Data Bliss Or Black Mirror On Steroids? Forbes. Retrieved from https://www.forbes.com/sites/bernardmarr/2019/01/21/chinese-social-credit-score-utopian-big-data-bliss-or-black-mirror-on-steroids/#25b8222048b8

3.3.4 Case Study – The Sony Hack: Cultural Lag

  1. Okay, one of the issues on the ethics side
  2. that we typically lump in is the regulation of this, right?
  3. And so part of the issue with data and data protection
  4. is that globally,
  5. there are different standards everywhere, right?
  6. And the nature of this data …
  7. Okay, again, we keep coming back to the idea
  8. that money is not in a vault anymore, right?
  9. It’s code somewhere, and so it’s information
  10. going in and out of servers.
  11. And when the data leaves a jurisdiction,
  12. it’s not like it’s physically leaving, right?
  13. But the server is maybe hosting data
  14. for someone in Hong Kong, in the Philippines,
  15. or travelling,
  16. the information could be travelling via lines,
  17. through the U.S. system.
  18. What do you think is the responsibility,
  19. from a regulatory standpoint, for consolidation?
  20. How can we, as society, have standards
  21. for these types of things when you have this
  22. spaghetti bowl-like mixup of regulations globally?
  23. – Yeah, so that’s a really great question.
  24. The reality is, I don’t think anyone
  25. has a great answer to it.
  26. On one hand, the way the U.S. tries to extend
  27. its regulatory reach is basically that
  28. there are a number of financial regulation laws that
  29. basically say, if you use U.S. dollars for transactions–
  30. – Every bank does.
  31. – Which almost every bank, every country,
  32. large company in the world has to do in some way,
  33. then somehow, because of that, you are touching
  34. the U.S. financial system, and if you’ve
  35. committed some sort of crime,
  36. or that transaction is part of a larger network
  37. of maybe illicit transactions,
  38. then you’ve maybe fallen into U.S. jurisdiction.
  39. So there’s a set of regulations that get to this.
  40. Now, but like you’re saying, what if you’re actually
  41. not even using currency, and how does that work?
  42. So that raises a broader set of questions, as well.
  43. I think, by its nature, if we think about cybersecurity
  44. as well as cyber regulation, by their very nature,
  45. those are reactive things.
  46. They will always be reacting to what just happened.
  47. And so it’s very difficult to put bright-line rules in
  48. that say, “Oh, ABCD,” and as a result,
  49. I think as users, and consumers,
  50. and people who will be impacted
  51. by these advances in technology,
  52. we have a responsibility to kind of think
  53. within an ecosystem of the values
  54. and principles that we might want to abide by.
  55. – Yeah.
  56. – Because I don't think we can…
  57. Continually, what we're going to find is that
  58. we can’t rely on law, and we can’t rely
  59. on governments per se, to be at the forefront
  60. of leading how we want to govern this aspect
  61. of the problem.
  62. – See, but this is the challenge, right?
  63. So, it’s not like a typical negotiation scenario
  64. where I’m going to buy something from you,
  65. and then I get the chance to say, “Well, I want the price
  66. “to go up or down,” whatever, right?
  67. Every single day, we click on potentially hundreds
  68. of websites where we are agreeing
  69. to their privacy policies.
  70. Sometimes you formally have to agree.
  71. A lot of times, it’s hidden behind the scenes.
  72. You’re not even paying attention to it, right?
  73. And so, on the one hand,
  74. that diminishes the value of those things,
  75. so essentially they’re pushing
  76. that burden on us, as the consumer, to say,
  77. “Do you agree to this or not?”
  78. But the challenge is it’s not like anyone
  79. is taking the time to read and understand those things.
  80. And then even if you did, it’s still not like you have
  81. the opportunity to negotiate.
  82. It’s not like you can, say, go to Facebook and say,
  83. “Okay, clause three, line number two,
  84. “I don’t think this is appropriate,
  85. “so let’s work that out”–
  86. – Yeah, the negotiation is, if you don't want to use it,
  87. then don’t use our service.
  88. – So, same with the banking system, right?
  89. The financial system.
  90. Either you’re in or you’re out.
  91. And so it’s not like we really have a choice.
  92. So even if consumers wanted to have a choice,
  93. it’s either do you opt in, or are you gonna
  94. eliminate yourself from this entire system?
  95. – Yeah, so that’s very interesting.
  96. And so we can see some analogies, or similarities,
  97. to maybe a few other types of situations.
  98. So for example, in
  99. the financial services space, particularly
  100. in the world of derivatives, we have
  101. organisations where market participants
  102. got together to set a set of ground rules
  103. of how they want to transact with one another
  104. because they didn’t want to have a lack of clarity
  105. or grey area, or didn’t want to wait
  106. for government or law to come in and say,
  107. “This is how it’s gonna be.”
  108. And so I think, from the consumer perspective,
  109. you’re right.
  110. At the individual consumer level, we don’t have
  111. a lot of individual kind of influence.
  112. But I think collectively, there is some influence.
  113. Similarly, I think what we want to do is invite
  114. companies to have these discussions
  115. amongst themselves as industry participants,
  116. as market participants.
  117. How do we want to create a fairer, more secure
  118. ecosystem for these products?
  119. Because ultimately,
  120. this is a very long-term game.
  121. But if they don’t have that discussion,
  122. then, in the long run,
  123. it will just become more problematic.
  124. – Yeah.

Additional Readings

  • Prenio, J., & Crisanto, J. C. (2017). Regulatory Approaches to Enhance Bank’s Cyber Security. FSI Insights. Retrieved from https://www.bis.org/fsi/publ/insights2.pdf 
  • Press, G. (2018). 60 Cybersecurity Predictions for 2019. Forbes. Retrieved from https://www.forbes.com/sites/gilpress/2018/12/03/60-cybersecurity-predictions-for-2019/#36f9c1214352 

3.3.5 Case Study – The Sony Hack: Privacy

  1. Getting back to
  2. the ethics of this,
  3. the saying goes that,
  4. “If you’re not paying for a service online,
  5. then you are the product.”
  6. – Product, that’s right.
  7. – Right. Yeah.
  8. And so–
  9. – Which is a great line, by the way.
  10. – It’s a great line, yeah,
  11. my students say all the time, you know,
  12. “We use this because it’s free.”
  13. I’m like, it’s not free.
  14. – You’re the product.
  15. You’re the data basically.
  16. – Yeah, exactly right.
  17. So the business model is now no longer
  18. even that overt thing;
  19. it is the data they're collecting behind the scenes,
  20. and therefore what they're doing
  21. with the data after the fact.
  22. Is there any consensus about the ethicality of that
  23. as a business model,
  24. especially when it’s often
  25. kind of hidden from the consumer,
  26. especially children for example,
  27. like a lot of games are free,
  28. so Candy Crush, those types of things,
  29. they're free, but they use the same psychology that
  30. created the gaming systems within,
  31. say, casinos, for example,
  32. to get your mind wrapped around one thing:
  33. I gotta do one more, I gotta do one more, right?
  34. Is that somehow
  35. kind of pernicious or unethical,
  36. or is it just an extension of, you know,
  37. people's weakness?
  38. – Well, I think that raises another great question.
  39. So we know
  40. most applications
  41. that basically monetize off of data or ads,
  42. and that require active users,
  43. embed a lot of psychology,
  44. – Yeah.
  45. – into the user interface,
  46. – Totally.
  47. – into what information comes into your feed,
  48. because over time, they're mapping
  49. the things that trigger you, basically.
  50. – And, just to clarify when you say using psychology,
  51. essentially you’re saying,
  52. using the weaknesses that they know exist within
  53. human behaviour collectively,
  54. – That’s right.
  55. – In order to keep us there.
  56. – From a behavioural science and
  57. psychological perspective,
  58. we know that we are less in control,
  59. than we often think we are.
  60. – Yeah.
  61. – Right,
  62. and there are certain triggers,
  63. colours for example, information, sounds,
  64. that tend to have influence on people's behaviour,
  65. and a lot of these companies,
  66. particularly social networking sites for example,
  67. you know, spend a lot of time actually actively
  68. thinking about this,
  69. to ensure that users spend as much time as possible,
  70. on their site.
  71. Because as users do that, they'll use it more,
  72. the companies collect more data from that, and then
  73. are able to feed it into the model again.
  74. – So my last question about this from a
  75. data standpoint is,
  76. we’re talking about these implications for us,
  77. what does this mean for the next generation,
  78. especially from an ethics standpoint, because
  79. one of the things that we've discussed is that
  80. the only major difference that they've found,
  81. kinda between previous generations and this generation,
  82. from an ethics standpoint is their perception of privacy.
  83. – That’s right.
  84. – And they’re living in a world without privacy,
  85. essentially right?
  86. – At least in a way we would’ve thought when
  87. we grew up.
  88. – Sure exactly.
  89. And so, you know, how do we perceive
  90. the next iteration of this,
  91. do we think that with distributed ledger technologies
  92. and other blockchain technologies,
  93. will we be able to control our own data,
  94. own our own data, kinda determine what people see,
  95. or is this just going to be a new way to solidify
  96. this power over the data?
  97. – So that creates a very interesting dichotomy
  98. in terms of the future,
  99. because on one hand,
  100. there's a big pursuit:
  101. blockchain
  102. and other technologies are, in some respects,
  103. more anonymous, right?
  104. Even though they're open, they're also more anonymous
  105. in terms of kind of protecting–
  106. – As anonymous as the system wants them to be.
  107. – That’s right.
  108. And so, in some sense,
  109. some of the FinTech
  110. technologies that we think about now actually create
  111. greater levels of anonymity than might've existed
  112. in the traditional financial system (see the sketch after this transcript),
  113. but on the other hand, there's a lot more information
  114. that was private that is now public as well.
  115. So it’s a very interesting dichotomy that people
  116. would have to live in as they get older,
  117. and I think when we think about our students,
  118. and our children as they grow older,
  119. they'll live in a world that's definitely less siloed,
  120. so thinking about, oh, this is a bank,
  121. this is a consumer company,
  122. this is a store,
  123. those kinds of distinctions I think will start
  124. blurring, as you alluded to.
  125. – Yeah, so one of my favourite kind of fake news
  126. clips of all time was from this website called
  127. The Onion, and they did a story,
  128. this is like you know, ten or more years ago,
  129. and so it’s very prescient in nature,
  130. but it was talking about Facebook,
  131. and they revealed, fake, this is just a joke,
  132. but they revealed that Facebook was
  133. actually a CIA operation,
  134. that was a secret programme to get people to post
  135. their private information in a public way,
  136. and they were joking about it
  137. because they were saying like,
  138. they called it Operation Overlord, I think,
  139. and that the leaders on Facebook were
  140. actually CIA operatives, and the idea was
  141. they had been working as a, you know,
  142. an intelligence agency to get
  143. private information on people for so long
  144. and then now they realised that everyone
  145. just post it anyway and they have like logs of
  146. where they’re going and whatnot right?
  147. So the idea, obviously again as a joke,
  148. but the idea being that you know,
  149. we live in a way,
  150. in a society where so many things are open
  151. and even the concept of privacy as you said,
  152. it doesn’t even mean the same thing that
  153. it used to mean.
  154. – Well, and to tie that back into
  155. something you mentioned about national security,
  156. the Facebook example was perfect for that,
  157. because we know that in
  158. the most recent US presidential election,
  159. there seems to be a lot of very clear evidence
  160. that there were certain elements,
  161. tied to various entities in Russia, that
  162. kind of used Facebook as a platform to try to
  163. influence certain election outcomes
  164. in the United States.
  165. And so, that is very much
  166. this idea of weaponization of data.
  167. And to influence outcomes that have very important
  168. national security considerations, right,
  169. who will be the leader of an important country
  170. in the world.
  171. And so we’re seeing that.
  172. – So the same use of data,
  173. that can make it easy for a large retailer to send you
  174. a personalised coupon,
  175. is the same analysis of data that can also convince you
  176. to pick a certain candidate in
  177. what is supposed to be a democracy.
  178. – That’s right.
  179. – This is a challenge.
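
To make the “open yet more anonymous” point concrete: public blockchains record transactions between address-like identifiers that are typically derived by hashing a public key, so anyone can audit the ledger without learning who controls an address. The sketch below is deliberately simplified (real chains such as Bitcoin derive addresses from ECDSA public keys with extra hashing and Base58Check encoding), and the names and amounts are hypothetical:

    # Minimal sketch of ledger pseudonymity: the ledger itself is public,
    # but the parties appear only as hashes, not names.
    import hashlib
    import os

    def new_address() -> str:
        public_key = os.urandom(33)  # stand-in for a real public key
        return hashlib.sha256(public_key).hexdigest()[:16]

    alice, bob = new_address(), new_address()
    ledger = [
        {"from": alice, "to": bob, "amount": 0.5},
        {"from": bob, "to": alice, "amount": 0.2},
    ]

    # Anyone can read every transaction ("open")...
    for tx in ledger:
        print(tx)
    # ...but nothing on the ledger says who controls either address
    # ("anonymous"), unless an address is linked to an identity off-chain.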

Additional Readings

  • Peck, M. (2017). Cheat Sheet: The Trade-Offs of Blockchain Privacy Tools. American Banker. Retrieved from https://www.americanbanker.com/news/cheat-sheet-the-trade-offs-of-blockchain-privacy-tools
  • Moskov, A. (2019). How Blockchain Can Save Our Privacy Before It Disappears. CoinCentral. Retrieved from https://coincentral.com/blockchain-and-privacy/

Module 3 Conclusion

  1. In conclusion,
  2. after all the stories about cyber crime,
  3. illegal use of cryptocurrencies, hacking
  4. and breaches of data privacy,
  5. many people, unfortunately, connect the rise of fintech
  6. with only bad things.
  7. They’ve lost trust in the institutions
  8. and innovators who are driving these changes.
  9. And to be sure, there have been a lot of scary stories
  10. that require immediate attention.
  11. But it’s also true that these new technologies
  12. can change the world in so many positive ways.
  13. So, what do we do?
  14. – Well, once again, society has a choice to make.
  15. From a proximity standpoint,
  16. these concepts may seem so distant
  17. that we don’t really take the time to understand
  18. or even question them.
  19. For example, we accept the terms
  20. and conditions of websites, like iTunes and eBay, so often
  21. that we have become desensitised
  22. and don’t really think
  23. about the potential future implications.
  24. Be honest, how many of you actually read those?
  25. And for innovators, they’re often so distant,
  26. or non-proximate, from the users
  27. that they can’t empathise
  28. with their concerns about data privacy.
  29. – We have this seeming paradox
  30. that pits our legal rights of personal privacy
  31. against the vast efficiencies
  32. and desirability of fintech innovations.
  33. For example, most people love their smartphones,
  34. and even those who don’t really love them,
  35. are reluctant to give them up
  36. because they’ve become so integrated into our lives.
  37. – But after a period of cultural lag,
  38. we are all now becoming aware
  39. that by carrying around and using our smartphones,
  40. we are giving up some aspects of personal privacy.
  41. And we love the idea of being safe and secure,
  42. particularly from violent terrorist attacks.
  43. But when law enforcement asks large tech firms
  44. to decrypt our smartphones,
  45. that can be quite unsettling.
  46. – But, has the era of privacy already passed?
  47. Have we already given up so much personal data,
  48. via social media, and our Google searches
  49. and purchasing habits,
  50. that these questions about privacy are moot already?
  51. And from an accountability standpoint,
  52. maybe you think the big tech firms and banks are so big
  53. that you can’t do anything about it anyway.
  54. I know that I have become so numb
  55. to the announcement of large data breaches,
  56. that I don’t even really think about them much anymore.
  57. But that probably needs to change.
  58. – In fact, maybe the opposite is true.
  59. Maybe since we are now more exposed than ever,
  60. giving up significant personal data
  61. on a minute-by-minute basis,
  62. we actually need to have even tighter regulations
  63. and controls on the firms
  64. who are collecting, using, analysing and sharing our data.
  65. – Now here’s the part that many of you may not yet realise.
  66. The fact is, that in many ways,
  67. we are not only the consumers,
  68. but are in fact the product
  69. that these large companies are trying to monetize.
  70. How do companies like Facebook and Google,
  71. which allow us to use their main services for free,
  72. make money?
  73. Data.
  74. Our personal data is what drives revenue at these companies
  75. and many others.
  76. – So what should we do?
  77. How do we strike a balance between protecting our privacy
  78. and ensuring sufficient security and data protection?
  79. And who should be accountable
  80. for cyber crimes, data breaches
  81. and other illicit uses of fintech innovations?
  82. – And what we are seeing now is only the beginning.
  83. As 5G connections and quantum computing become more common,
  84. data collection and analytics are only gonna increase,
  85. driving the next iteration of machine learning
  86. and artificial intelligence,
  87. which you’re gonna focus on in the next module.

Module 3 Roundup

  1. – Welcome to our roundup for week three.
  2. Can you believe we’re already halfway through the course?
  3. Now, we mentioned this last week
  4. but it bears mentioning again.
  5. We really appreciate all the active participation
  6. in the discussion board.
  7. It has been really dynamic.
  8. I mean, the quantity of the comments has been great,
  9. but more really the quality
  10. of the insights and experiences that have been shared
  11. have really impressed us.
  12. We’ve been really blown away
  13. and there’ve been definitely a few times
  14. where mutually we’ve thought, wow,
  15. it’d be really cool if we could build on the discussions
  16. in a live classroom.
  17. – And we’re also really grateful for those of you
  18. who may have joined the course a little late
  19. but are not any less enthusiastic
  20. in sharing your thoughts, experience, and opinions with us.
  21. This course is really meant to be a continuous discussion,
  22. so wherever you are right now,
  23. please take your time
  24. and we’ll try to respond
  25. to some of the newer comments in the earlier modules
  26. from time to time.
  27. And that being said,
  28. we also really highly recommend
  29. that you read and comment on other people’s posts
  30. and take advantage of the full learning community.
  31. And, as I said in some of the feedback,
  32. the course is really only as good
  33. as the learners who are taking it.
  34. So we really do appreciate you and thank you
  35. and ask you to keep contributing your unique experiences
  36. and help further enrich the course.
  37. – So, we covered a number of really entertaining
  38. but also important cases
  39. in module three,
  40. which we hope compelled you to think
  41. through the implications of new technologies
  42. and how they intersect with crime and security.
  43. From the comments in the discussion forum,
  44. it seems many of you have been thinking
  45. about really similar questions too.
  46. So we want to spend some time addressing
  47. some of the great questions and contributions
  48. that were made.
  49. – But before we jump into that,
  50. just a quick update on enrollment.
  51. So, we’re over 5,000 students now,
  52. which is already way more than we ever imagined.
  53. And from the feedback we received from many of you,
  54. it seems the course has been informative and interesting.
  55. If so, please consider sharing it
  56. with your friends, colleagues, family,
  57. within your organisations,
  58. because we really genuinely believe
  59. these questions that we’re considering in the course
  60. are crucial to crafting a better future.
  61. Now, with that out of the way,
  62. on to the comments.
  63. – So, RichardStample had another great comment this week,
  64. which is becoming a pattern, very consistent.
  65. He had a great comment about, hey,
  66. the difference between something
  67. that is retroactive in the law perhaps
  68. versus reactive.
  69. So maybe Dave you could take a crack at that
  70. and just share your thoughts.
  71. – Yeah, so, first, I was actually really impressed.
  72. I made a mistake when I was speaking in that part,
  73. it was during one of the conversations we were having
  74. and kinda riffing back and forth,
  75. and I said that the law is retroactive
  76. and then kind of immediately caught myself
  77. and then said reactionary.
  78. But those are actually two distinct
  79. and important aspects of the law.
  80. So, retroactive, if you’re not familiar,
  81. means that,
  82. when used in the legal context,
  83. that if you create a law,
  84. then it begins and is put into force at an earlier time.
  85. So let’s say, as of today,
  86. there’s a new law that says taxis are no longer legal.
  87. And if it was retroactive, you could say,
  88. and the date takes effect from January 1, 2019.
  89. And so therefore anyone
  90. who actually was operating a taxi service
  91. from January 1, 2019
  92. would have in some way violated the law.
  93. There is this legal concept and it does happen,
  94. but typically a retroactive law in this fashion
  95. would be something that’s more positive.
  96. So, amnesty, for example.
  97. If you enter a country illegally
  98. and if you’ve been here for a certain amount of time,
  99. then they could say,
  100. as of this date,
  101. anyone who entered before this time
  102. is retroactively kinda forgiven.
  103. So, what I meant to say, though,
  104. and what the conversation was really about
  105. was how the law is reactionary, is reactive,
  106. meaning that the law tends to…
  107. We tend to create laws in order to solve existing problems
  108. after they occur.
  109. And this is good because,
  110. if you think of a Minority Report standpoint,
  111. you don’t wanna
  112. punish people for crimes they haven’t committed,
  113. and you don’t want the law to kinda predict
  114. what is going to be happening.
  115. That’s not what the law is for.
  116. But what that also means is if we’re always reactionary,
  117. if we’re always reacting to things
  118. that have happened in the past,
  119. then from an ethics standpoint
  120. it means that often you can have criminals or bad actors
  121. or just normal people doing what is technically legal
  122. but maybe a little bit unethical,
  123. and then the law is never going to be able
  124. to stay ahead of that.
  125. So, I appreciate you pointing that out.
  126. It was something every time I listened to that segment
  127. I would always cringe a little bit
  128. because I knew I made a small mistake.
  129. But it is an important concept of the law
  130. and it gets into kinda cultural lag
  131. and why the non-material aspect of the culture like the law
  132. is very slow to change
  133. whereas the material aspects of culture like technology
  134. changes very quickly
  135. and there are often gaps in between the two.
  136. Okay, so the second comment that we wanted to point out
  137. is from joergHK.
  138. Again, a frequent commenter,
  139. we really appreciate all of your additions to the course.
  140. So, it’s two comments in one basically,
  141. and he said,
  142. “Please don’t move fast and break things.”
  143. And for those that are not familiar
  144. with where he’s coming from,
  145. this is actually kind of a modification of a statement
  146. that was made popular by Mark Zuckerberg who said that,
  147. you know, in Silicon Valley–
  148. – So, the founder of Facebook.
  149. – Exactly, the founder of Facebook.
  150. He said, “We move fast and break things.”
  151. That was kind of the mentality of Facebook
  152. and has been adopted by many startup founders
  153. in the Silicon Valley region.
  154. And so joergHK was saying, again,
  155. bringing cultural lag into this,
  156. he was saying, “Please, just take a minute, slow down.”
  157. Break things, disrupt things, sure.
  158. But let’s take some time
  159. and make sure that as we’re doing so,
  160. we’re doing it in a way that’s kind of thoughtful
  161. in that regard.
  162. He also talks about, though,
  163. how there are some ways from a regulatory standpoint
  164. that governments and institutions
  165. can advance technology forward
  166. while maybe minimising the risk of the breakage,
  167. the disruption.
  168. And he mentioned something called a sandbox.
  169. And so I wanted to maybe ask you
  170. to kinda describe what is a sandbox,
  171. especially from a fintech standpoint,
  172. and how are they being applied.
  173. – Yeah, so, sandboxes are interesting.
  174. They’ve kind of
  175. come into vogue in a sense in a lot of places in the world
  176. as financial markets try to understand how we’re gonna cope
  177. with these new technologies that come in
  178. with respect to current regulations.
  179. Because current regulations were made
  180. in the context of kind of a traditional market structure,
  181. and there’s aspects of new technologies, like fintech,
  182. that will come in and change how that happens.
  183. And so some creative regulator somewhere,
  184. I’m not exactly sure,
  185. said, “Hey, let’s have something called a sandbox.”
  186. This is not like the toy that you played in
  187. when you were little.
  188. – Although that’s what it’s named after.
  189. – But that’s what it’s named after.
  190. It’s this idea of let’s wall off this space
  191. and allow these innovators to play in this space,
  192. not subject to or constrained by certain regulatory measures,
  193. and let’s see what happens.
  194. – Yeah, give them their toys and let them play
  195. and see what happens.
  196. – And that will give us indications
  197. of perhaps how we should regulate certain behaviours.
  198. But if we put current regulation on them,
  199. they may actually not be able to grow
  200. and it may not be applicable,
  201. but we wouldn’t know that
  202. because they’re gonna be constrained to begin with.
  203. And so by putting in a regulatory sandbox,
  204. that allows these kind of new companies
  205. that are kind of on the fringe
  206. of certain regulatory rules,
  207. it gives them an opportunity to expand a little bit,
  208. as well as for regulators to observe what happens
  209. and how that occurs.
  210. But at the same time the observation thing is important
  211. because they may not be subject
  212. to the current rules and regulations
  213. but they in theory should be observed.
  214. The effects,
  215. the impact that they’re having on customers in particular,
  216. how is that working?
  217. So one thing we were discussing
  218. in a class we had earlier today,
  219. a live class that Dave Bishop and I had
  220. related to fintech, was
  221. the success of regulatory sandboxes
  222. in particular jurisdictions in Asia, like Singapore.
  223. And one of the things we thought was really great
  224. was that Singapore, it seems, has coordinated
  225. a number of different policies
  226. in conjunction with their sandbox initiative,
  227. even from a few years ago.
  228. I remember hearing about what Singapore was doing
  229. a few years ago.
  230. In Hong Kong,
  231. we’ve only recently gone down this regulatory sandbox route
  232. and I think we’re still trying to coordinate this
  233. a little bit more with broader policies
  234. and different things that regulators are trying to do.
  235. So I think,
  236. we think, that’s quite an important piece to have
  237. if you really wanna cultivate innovation.
  238. Because if you have people
  239. and companies that put out new products,
  240. but they can’t test them in a neutral way,
  241. not subject to kind of the same regulations
  242. that a fully licenced and staffed brokerage firm or bank
  243. would have to subject themselves to,
  244. then that can be very onerous on these new innovators.
  245. – Yeah.
  246. Great question, thank you.
  247. Or great point.
  248. – So, one of the other awesome comments
  249. that we got in the discussion board this week
  250. was about privacy.
  251. So, CelesteMunger,
  252. I think, it looks like she’s from Canada,
  253. talked about her thoughts on privacy,
  254. and one of the things that she started off with was
  255. that privacy was an illusion.
  256. – Privacy is an illusion.
  257. Period.
  258. – And then she ended with an example of DNA testing,
  259. which is another large area
  260. where a lot of privacy concerns have been raised recently
  261. and will continue to be raised,
  262. particularly in the biotech space
  263. and technologies related to genetics and things like that.
  264. So, on that note,
  265. what do you think about these issues of privacy?
  266. I mean, they’re super important,
  267. but how do we think about them?
  268. – So,
  269. she is probably right to a certain extent.
  270. Privacy is an illusion from the standpoint
  271. of a fully traditionally private life
  272. because we are constantly, as she pointed out,
  273. being recorded
  274. and we are ourselves giving out significant information.
  275. But at the same time
  276. I’m not sure that’s what the definition of privacy means
  277. from a rights standpoint.
  278. I think if you think of the right to be forgotten,
  279. if you think about the right
  280. to be able to pull back your information,
  281. if you think about the right
  282. to be able to do what you want in your own home,
  283. which really is fundamental to many other rights
  284. in terms of human sexuality and having children,
  285. there’s so many aspects of that,
  286. and
  287. just because we don’t have as much privacy in our lives
  288. as we go out in public
  289. doesn’t mean that it’s necessarily eroded privacy
  290. as a right.
  291. And so I think this is the part that we as society
  292. have to do maybe a better job of really thinking through.
  293. As peter-nyc pointed out in a previous module
  294. in one of his comments,
  295. the concept of privacy, especially privacy as a right,
  296. is in and of itself a relatively recent legal construct.
  297. It only started happening a little over a century ago,
  298. and even up until the 1970s–
  299. – In the United States.
  300. – Well, correct.
  301. – And then globally later.
  302. But we’re talking about a US legal context.
  303. – Yeah, in the US legal context,
  304. it really started 150 or so years ago
  305. but really wasn’t institutionalised or even codified
  306. until the 1970s, actually,
  307. when some US Supreme Court cases,
  308. including the famous Roe v. Wade
  309. which dealt with abortion rights,
  310. where they said there was an implied right to privacy
  311. in the US Constitution,
  312. and so within the United States
  313. and other primarily Western democracies
  314. there was this codified right to privacy.
  315. And so, in that context,
  316. in those nations where that right still exists,
  317. I think we do still have a very strong right to privacy,
  318. although that does perhaps seem
  319. to be eroding slightly.
  320. So I think there’s a distinction
  321. between the legal right to privacy
  322. versus how much information about us
  323. is kind of flowing out on a daily basis.
  324. And so it’s complicated
  325. but it’s important to be able to parse those things
  326. because as regulation comes in we want, I think,
  327. I should maybe speak for myself,
  328. I want more regulation
  329. on dealing with my private information,
  330. but that’s more–
  331. – That might not be a technical right to privacy.
  332. – Exactly.
  333. That’s like personal information
  334. that I wanna make sure is being used responsibly
  335. and that I understand what’s happening with it.
  336. But it’s separate
  337. from my overarching constitutional right to privacy
  338. which means when I’m in my own home I can do what I want,
  339. that type of thing.
  340. – I think that’s an important point.
  341. That potentially the link
  342. between traditional forms of right to privacy,
  343. which I think initially were like,
  344. in your own home,
  345. people shouldn’t be able to just come in
  346. and see what you’re doing.
  347. And that linked to people shouldn’t just be able
  348. to come in and look on your phone.
  349. Perhaps there is a link there,
  350. but I don’t think that link is traditional in a sense.
  351. And maybe that will evolve over time.
  352. But we tend to use the vocabulary, a right to privacy,
  353. in various forms
  354. and I think, to your point,
  355. that has evolved over the last century or so
  356. in what form that takes.
  357. So, early on it was about what you’re doing in your house.
  358. But even then, certain activities,
  359. physical activities, sexual activities,
  360. weren’t necessarily protected for many centuries and decades
  361. in America.
  362. And then that changed.
  363. And then when we get to women’s rights,
  364. that idea of what I do with my body,
  365. is that a right to privacy?
  366. What right does that fall under?
  367. Because in a lot of situations
  368. these are not explicitly stated so they’re inferred rights.
  369. And so, again, this will I think continue to evolve
  370. given how technology is evolving.
  371. – Yeah, and I do think it’s interesting
  372. because although DNA is not a fintech technology,
  373. it is a very interesting example that she’s provided.
  374. For those that are not familiar,
  375. just to give you an example
  376. of a case that happened in the US within the last few years.
  377. There was a gentleman in California
  378. who did one of these private DNA-testing services.
  379. You swab the inside of your mouth,
  380. you put it into a vial,
  381. you send it in,
  382. and then they provide you DNA information about yourself.
  383. And maybe what he didn’t realise at the time
  384. was that as part of the user terms of service,
  385. you also agree for that DNA information
  386. to be uploaded on a public website
  387. which then becomes,
  388. I guess, I don’t know all the details,
  389. but public domain or something.
  390. – Yeah, usually.
  391. Because I actually did get my DNA tested
  392. on one of the commercial providers,
  393. I don’t know, a few years ago.
  394. It serves a lot of different purposes.
  395. It’s interesting from a genealogy perspective,
  396. you can kinda see where your forefathers came from,
  397. and there are health indicators that can be helpful.
  398. And there’s some question
  399. about how accurate they really are,
  400. but it gives you a general sense.
  401. But yeah, one of the things I remember
  402. as I was doing research was
  403. they take you through a series of terms and conditions
  404. and they basically say,
  405. “Would you allow your data
  406. “to be included in certain databases
  407. “that will be used and tested and whatnot.”
  408. And I always opted out of those
  409. because I was kind of aware of those issues.
  410. So, basically, the fundamental question is,
  411. where will that end up eventually?
  412. And you actually don’t know that,
  413. it’s not clear to you.
  414. And until that was clear to me I didn’t want to participate
  415. so I opted out as much as I could.
  416. And I think, to your point, this person didn’t do that,
  417. which ended up– – Do you know what happens?
  418. – Yes. – Okay, go ahead and finish.
  419. – Well, so the FBI apparently was looking for a,
  420. I don’t know if it was the FBI,
  421. but police authorities were looking for someone
  422. who apparently had killed a number of people.
  423. – A serial killer, yeah.
  424. – And they had some DNA evidence
  425. and through basically linking of genealogy,
  426. so genetics of family trees,
  427. they were able to figure out,
  428. oh, this person was probably related to this person,
  429. and eventually figured out it was this particular individual
  430. who had actually provided his own evidence himself
  431. that ended up leading to his arrest.
  432. – Yeah, so it was really kind of amazing
  433. and yet scary at the same time.
  434. So a lot of people that read this story were like,
  435. “Wow, that is so cool.”
  436. – Like CSI, the TV show. – Yeah, CSI.
  437. I mean, these cases had gone cold, I think in the 1980s,
  438. so it’d been 30 or more years.
  439. The idea is the killer is long gone,
  440. there’s no way we’re gonna find them.
  441. And then boom, you’ve got him.
  442. But then it’s like, oh, wait, wait a minute.
  443. This guy sent in his vial of DNA,
  444. he was not expecting
  445. this to be run through a criminal database.
  446. And so, again, there was a good outcome,
  447. you found a serial killer,
  448. but I think it caused a lot of people to think,
  449. now, wait a minute,
  450. what’s gonna happen 10, 20, 30 years into the future?
  451. What if they want to go after someone, whatever,
  452. because of ideology or race?
  453. – Yeah, and this becomes one of these double-edged swords
  454. because I think probably when we were in law school,
  455. a number of law schools in the United States
  456. got involved in the Innocence Project,
  457. where they were basically trying to represent people
  458. who they thought were falsely imprisoned.
  459. And one of the ways that they were able to help
  460. a lot of these people that were incarcerated,
  461. usually minorities,
  462. socioeconomically very disadvantaged,
  463. was through the advances in DNA technology.
  464. So, oh, actually this evidence that you have
  465. doesn’t match this person.
  466. And then they were able to free a number of people.
  467. So, again, of course you don’t want people
  468. to be incarcerated wrongly,
  469. but at the same time,
  470. you can see a lot of different situations
  471. where a proliferation of this kind of data,
  472. as it becomes commercialised or commoditized,
  473. falls into the hands of actors
  474. who are using it maybe not for nefarious purposes
  475. but for profit,
  476. and then it ends up becoming a problem.
  477. So you can easily think of people that need insurance,
  478. yet an insurance company getting genetic markers,
  479. and even if you’re not sick they say,
  480. “Well, you’ve got an X percent chance
  481. “that you’re gonna get sick with this disease
  482. “so we’re not gonna insure you.”
  483. So these are not necessarily, I don’t think,
  484. the type of outcomes that we want.
  485. Or at the very least
  486. these are the type of things we wanna think about
  487. before just wholesale, let’s open this up.
  488. – Yeah, so, again, like many things in the course,
  489. double-edged sword.
  490. There’s a lot of benefits that can come from this,
  491. probably some unintended negative consequences,
  492. and so we need to be very thoughtful about these things
  493. as we roll them out.
  494. – Our heartfelt thanks again
  495. for your participation and contributions.
  496. Putting this course together definitely was not easy,
  497. a labour of love, with the emphasis on labour.
  498. But your enthusiastic engagement
  499. has really made the effort worth it.
  500. – Now in some ways the next module is really our favourite.
  501. In module four we will explore
  502. artificial intelligence implications,
  503. which are relevant now and will only become more so in the future.
  504. And we’re sure that many of you are already thinking
  505. about artificial intelligence in some way,
  506. and we hope the content is interesting
  507. and really look forward to your thoughts and reactions.
  508. So we’ll see you next week.

Module 4 Artificial Intelligence and FinTech

4.0 Module 4 Introduction

  2. So, welcome back,
  3. we are halfway through the course now,
  4. and now you get to celebrate!
  5. So imagine that a friend calls to inform you
  6. that she has won two tickets to a concert
  7. with your favorite musician performing.
  8. The concert is this weekend
  9. and your friend invites you to use one of the tickets
  10. that she has won.
  11. Wow, what a great friend, huh?
  12. Now you are super excited
  13. and can’t wait until this weekend.
  14. As you and your friend enter the lively concert venue,
  15. you notice an impressive kiosk covered
  16. with multiple flatscreens showing video footage of
  17. your favorite musician performing.
  18. So you stop for a few minutes
  19. to watch some of the videos cycle through
  20. and now you’re really excited for the concert
  21. and head to your section ready for a great show.
  22. Like you and your friend,
  23. thousands of other concert-goers
  24. also stopped at the kiosk
  25. to watch videos in preparation for the concert.
  26. However, what neither you, your friend,
  27. nor the other concert-goers realized was that
  28. while all of you were watching videos,
  29. cameras embedded in the kiosk were also watching and
  30. taking photos of you.
  31. Your image, along with those of most of the other fans
  32. that stopped in front of the kiosk, was captured
  33. and analysed by facial recognition technology.
  34. You see,
  35. your favorite musician has a number of stalkers
  36. that have made various threats over the years,
  37. so the facial recognition analysis was a
  38. precaution to identify anyone that
  39. might be potentially dangerous.
  40. Does this seem like a scene out of a movie?
  41. Or is this a type of technological Big Brother intrusion
  42. that seems at least a few years off?
  43. This may be surprising to many,
  44. but this story is not an imaginary future,
  45. it is actually the past,
  46. and describes what occurred at a Taylor Swift concert
  47. in May 2018 as reported by the New York Times.
  48. Besides sharing what was until now our secret,
  49. undercover interest in Taylor Swift,
  50. this story raises a few important concepts
  51. worth exploring.
  52. Now we don’t claim to have all the answers,
  53. but we’ll share some of our thoughts,
  54. and we invite you to consider these questions as well.
  55. First, given the potential threat of stalkers,
  56. were the actions of setting up a covert photo-taking
  57. kiosk and using facial recognition technology
  58. reasonable?
  59. And, would your opinion change if someone was caught
  60. versus if someone wasn’t caught?
  61. And should it?
  62. Second, and more broadly,
  63. should people be informed that they are being recorded
  64. and that the images are being analysed, processed
  65. and potentially being included as part of a database?
  66. At the Taylor Swift concert,
  67. the cameras were not readily visible.
  68. But the reality for most people,
  69. especially in urban locations,
  70. is that we are really under near constant surveillance
  71. already.
  72. To use another concert example, in April 2018,
  73. a man by the name of Ao went to a concert of
  74. 60,000 people in China
  75. – and unbeknownst to him,
  76. during the performance of Jacky Cheung,
  77. a Cantopop superstar,
  78. all of the people within the audience were
  79. having their faces surveilled by cameras.
  80. And right in the middle of the performance,
  81. police went down the aisle and they actually
  82. apprehended Mr. Ao and took him away.
  83. It turned out that he was a wanted criminal
  84. and during the time of the concert,
  85. as he was sitting there, unbeknownst to him,
  86. they were able to realize that he was a
  87. wanted criminal and took him to jail.
  88. In another example from China,
  89. this public surveillance was highlighted
  90. by BBC reporter John Sudworth back
  91. in December 2017.
  92. Now, it is estimated that there are at least 170 million
  93. surveillance cameras all over China
  94. and the plan is to install upwards of
  95. 400 million cameras over the next few years.
  96. So Mr. Sudworth, he visited the city of Guiyang,
  97. the capital city of the Guizhou Province of China,
  98. which is actually only a few hours from us
  99. here in Hong Kong.
  100. While in Guiyang, Mr. Sudworth participated
  101. in a little exercise,
  102. where he was tasked with avoiding detection from
  103. Guiyang’s network of cameras for as long as possible.
  104. Now Guiyang is home to about 4 million people,
  105. so it’s not a small place.
  106. How long do you think he was able to avoid detection?
  107. Well…
  108. He was discovered and detained by authorities
  109. in about 7 minutes.
  110. Below this video we provided a link
  111. so you can watch a short clip of his experience
  112. to put it into context.

Additional Readings

  • Deb, S., & Singer, N. (2018). Taylor Swift Said to Use Facial Recognition to Identify Stalkers. The New York Times. Retrieved from https://www.nytimes.com/2018/12/13/arts/music/taylor-swift-facial-recognition.html
  • Liu, J. (2017). In Your Face: China’s All-seeing State. BBC. Retrieved from https://www.bbc.com/news/av/world-asia-china-42248056/in-your-face-china-s-all-seeing-state 

4.1.1 Public Surveillance – Privacy vs. Security

  1. So we just wrapped up these very interesting stories
  2. and experiences about the use of public surveillance
  3. in identifying and capturing people
  4. in a variety of public settings
  5. including train stations and concert halls.
  6. So, let’s ask a more
  7. kind of fundamental, basic question then.
  8. Are the actions
  9. of setting up covert photo-taking kiosks,
  10. or relying on this wide-ranging
  11. and wide-scale facial recognition technology,
  12. reasonable?
  13. And if so, when?
  14. – Mm.
  15. – Dave, what do you think?
  16. – Yeah, it’s tricky for me,
  17. because on the one hand,
  18. I completely understand
  19. the kind of public security standpoint.
  20. But as someone who grew up
  21. in a very conservative place,
  22. I guess my immediate, initial thought is
  23. one of privacy.
  24. Right?
  25. – Okay.
  26. – So even if there’s one guy in the crowd
  27. who may pose a risk, to say Taylor Swift,
  28. or maybe even the community,
  29. there’s 59,999 other people
  30. that are not really posing any threat,
  31. and yet they are having their face scanned,
  32. information about their location,
  33. their preferences, the things that they like,
  34. being recorded, and, you know,
  35. that, for some reason,
  36. doesn’t really resonate.
  37. It feels really weird to me.
  38. – So, I understand the privacy argument
  39. and I think it’s important.
  40. And I think people like to think at least that
  41. hey, I’m a person unto myself that should be respected.
  42. But what are the real costs for somebody
  43. in that audience
  44. who, let’s say, is not that criminal,
  45. not that threat to Taylor Swift,
  46. or a criminal who’s being taken out of the concert hall
  47. by the police.
  48. At the end of the day, it sounds like their privacy
  49. is actually still being preserved, right?
  50. – But the thing is whether we recognise that or not,
  51. we are under constant surveillance.
  52. And again, you could say that
  53. they got the one guy, they got the one bad guy,
  54. but everyone else, you know,
  55. their privacy wasn’t really violated.
  56. What happens when they’re looking for someone
  57. based on ideology?
  58. What happens when it’s not a benevolent government
  59. that’s utilising that technology?
  60. What happens when it’s not a government at all,
  61. – Mm.
  62. – and it’s private actors
  63. that are utilising those technologies
  64. to somehow bifurcate society
  65. or to restrict the rights of others.
  66. I mean it really doesn’t require
  67. that much imagination to concoct a scenario
  68. where an individual,
  69. a large company, or even a government
  70. could utilise these types of technologies
  71. to single out people and potentially cause them
  72. very significant, personal injury.
  73. And maybe I’m old-fashioned,
  74. and I know that you have consumers
  75. that are actually choosing this on their own,
  76. either knowingly or unknowingly.
  77. They’re putting watches on children,
  78. that surveil them everywhere they go.
  79. Obviously our phones, to a certain extent,
  80. are kind of watching
  81. where we go.
  82. And so maybe I’m being naive as a consumer,
  83. and maybe this is already occurring,
  84. but the idea of linking these things
  85. with facial technology or facial recognition software,
  86. geolocation and government police powers
  87. is something that’s actually quite disconcerting.

Additional Readings

  • Pandya, J. (2019). The Democratization of Surveillance. Forbes. Retrieved from https://www.forbes.com/sites/cognitiveworld/2019/03/02/the-democratization-of-surveillance/#2bbb1ab0177d 

4.1.2 Public Surveillance – Accountability and Cultural Lag

  1. So I think the idea of regulation
  2. is actually quite interesting
  3. and we’ve talked about it both in the context
  4. of this module,
  5. as well as previous modules
  6. but more broadly,
  7. should people be informed that they are being recorded
  8. and that their images are being analysed, processed,
  9. stored and used in other ways?
  10. Is that something that regulation
  11. should be concerned about?
  12. – Yeah, so the easy answer is yes, of course.
  13. I mean, I’m sure if someone’s recording you,
  14. you’re gonna wanna know it
  15. and most laws around the world,
  16. they do already have some level
  17. of notification requirement.
  18. Unless there’s, say, a journalistic exception
  19. within the law.
  20. But here’s the problem.
  21. So, with anything that’s ubiquitous,
  22. meaning it’s around us all the time,
  23. we become so desensitised even to warnings,
  24. that we just tend to ignore them.
  25. So think of like a streetlight or something, right?
  26. There’s so many things that are there
  27. to kind of guide us, protect us day in and day out–
  28. – I guess an example of that would be,
  29. like all these signs we see saying “CCTV in operation,”
  30. – Yeah, exactly.
  31. – which we see everywhere.
  32. – Which is probably there just
  33. because of a legal requirement.
  34. – It’s a legal requirement.
  35. – To notify you.
  36. – Exactly, and so therefore, if they were
  37. to use that video recording against you
  38. or perhaps in a court of law,
  39. they would be able to say,
  40. we were authorised to do so
  41. because we met this bare minimum requirement.
  42. – Notification.
  43. – Exactly.
  44. If you think of the Taylor Swift example, though,
  45. very few people when they buy a ticket to go
  46. to a concert are actually gonna read through
  47. the terms and conditions of that particular event.
  48. I don’t, and I’m a lawyer.
  49. I’m sure, you know, the same thing probably for you.
  50. And on a daily basis, we click I accept, I accept
  51. on so many notifications, that again
  52. the kinda ubiquity desensitises us
  53. to the fact that these are real legal notifications.
  54. So, I think we have to start thinking as society,
  55. if we’re gonna take this stuff seriously,
  56. what are not only the moral, but the legal implications
  57. in a very practical context to make sure
  58. that we’re taking these notifications seriously,
  59. and that we actually understand
  60. what rights we’re giving away.
  61. Because the reality is I think, every single day,
  62. we’re giving away pretty significant rights.
  63. – And so I think that’s really interesting.
  64. So, there’s a whole area
  65. that is somewhat regulated,
  66. and so the example would be
  67. – Right.
  68. which you just described is there’s a lot
  69. of laws talking about notification
  70. of when you’re recording somebody,
  71. be it audio, visual, whatever.
  72. But then there’s this whole other area of law
  73. that is still completely unsettled or unregulated
  74. – Yeah, yeah.
  75. Which is what we’re dealing with now
  76. in the context of AI.
  77. is, okay now that you’ve processed
  78. and analysed all this data,
  79. what legal obligation do you have
  80. if you’re the one whose processed or analysed this
  81. towards the person that you’ve actually recorded.
  82. And actually in a lot of places in the world
  83. it’s completely unsettled
  84. – Yeah.
  85. So much so that there’s actually people
  86. or companies that can use that data
  87. that they’ve analysed or processed and maybe sell
  88. on to the third parties.
  89. – Yeah.
  90. Right, and that’s purely because
  91. it is unregulated.
  92. And so that creates an interesting space to what
  93. you’re talking about
  94. is hey, if we don’t as citizens of whatever
  95. countries we’re in, or as people,
  96. as just citizens of society,
  97. if we don’t articulate the values that we want
  98. regarding privacy or security
  99. or whatever it may be,
  100. – Right.
  101. then it’ll be very difficult
  102. for us to roll back
  103. – Extremely difficult.
  104. or identify, or partition off
  105. the rights that we do wanna protect.
  106. – Right.
  107. Yeah, and this is a great example,
  108. if you remember going back to Module 1,
  109. we talked about cultural lag and the idea
  110. that it often takes time for the culture
  111. within a society to catch up to the change,
  112. very rapid change in technology, right?
  113. And, you know, thinking of within my own classroom
  114. for example,
  115. I often ask my law students,
  116. “Raise your hand if you have a camera with you.”
  117. And there’s usually kind of a few seconds
  118. of stunned silence and then immediately
  119. it dawns on them,
  120. that yes they do have a camera with them, right?
  121. – They probably have more than one.
  122. – Yeah, smartphone, right, even a smartphone alone
  123. has multiple cameras and so then again,
  124. I ask them, okay, well now raise
  125. your hand if you have two,
  126. they then realise on their laptop,
  127. on their iPad, in all these devices
  128. they actually have multiple cameras
  129. with them right there in that moment.
  130. And so, you know, if you think about that
  131. from a cultural lag perspective,
  132. these technologies change so quickly
  133. that we have them on our person at all times.
  134. Which means that we as individual citizens
  135. are also the ones that are kind of surveilling
  136. those that are around us, right.
  137. Now what do you see?
  138. You go on YouTube and you’ll see
  139. interactions of an auto-accident
  140. where normal everyday cars are filming
  141. everything that’s going on, right.
  142. You’ll see individuals getting into a fight,
  143. or an altercation,
  144. they automatically whip out their phone, right.
  145. And so it’s interesting how, again,
  146. we’re not just talking about governments here.
  147. And these technologies are expanding
  148. so that the ability, the costs,
  149. the size of the files, the stream rate
  150. all these different things are making it so that,
  151. this is really around us all the time.
  152. And again, we have to take some time
  153. to really evaluate from a cultural perspective
  154. how we expect these things to evolve,
  155. because if we don’t, then the companies
  156. through various forms of capitalism
  157. are gonna make those decisions for us.
  158. – Yeah.
  159. And those lessons are broadly relevant
  160. to artificial intelligence, but also,
  161. specifically relevant for the issues
  162. that we’ll face in FinTech and financial technologies.
  163. – Absolutely. Yeah.

Additional Readings

  • Chinoy, S. (2019). We Built an ‘Unbelievable’ (but Legal) Facial Recognition Machine. New York Times. Retrieved from https://www.nytimes.com/interactive/2019/04/16/opinion/facial-recognition-new-york-city.html 
  • Barber, G. (2019). San Francisco Could Be The First To Ban Facial Recognition Tech. Wired Magazine. Retrieved from https://www.wired.com/story/san-francisco-could-be-first-ban-facial-recognition-tech/
  • Naughton, J. (2019). ‘The Goal is to Automate Us’: Welcome to the Age of Surveillance Capitalism. The Guardian. Retrieved from https://www.theguardian.com/technology/2019/jan/20/shoshana-zuboff-age-of-surveillance-capitalism-google-facebook
  • Walsh, D. (2018). How Much Is Your Private Data Worth — and Who Should Own It? Graduate School of Stanford Business. Retrieved from https://www.gsb.stanford.edu/insights/how-much-your-private-data-worth-who-should-own-it 

4.2.1 What Is Artificial Intelligence (AI)?

  1. Hopefully, you are still with us
  2. and not scrolling through Taylor Swift music videos.
  3. Because what we’re going to explore
  4. in the rest of this module is interesting
  5. and important…
  6. and Taylor Swift will still be there after we’re done,
  7. we promise.
  8. Initially, the story about Taylor Swift’s concert
  9. or the BBC reporter’s experience in China
  10. does not seem to be related to FinTech, right?
  11. So where is the connection?
  12. Well the advances in surveillance we’ve shared with you
  13. are not just about more and better cameras,
  14. but really about the facial recognition
  15. and identity analysis software
  16. that is growing more efficient
  17. due to advances in artificial intelligence (“AI”)
  18. and other technologies that fall under the
  19. broad umbrella of AI, like machine learning.
  20. Now if that phrase is vague to you right now,
  21. don’t worry, we’re going to get to that soon.
  22. Now people have been working on facial recognition
  23. software and forms of AI for a while.
  24. In fact, a trio of early technologists,
  25. Charles Bisson, Woody Bledsoe,
  26. and Helen Chan,
  27. researched how computers could be used
  28. for facial recognition as early as the 1960s.
  29. So today’s “hot” concepts did not just pop up,
  30. but because of the increases in computing processing power,
  31. the potential of AI is starting to be realized,
  32. which has propelled AI into the public discourse,
  33. and rightfully so.
  34. So what that means is for those of us participating
  35. in this course, you and me, in our lifetimes,
  36. many of the big leaps in FinTech will be enabled
  37. by computing power that has resulted
  38. in more mature, developed AI.
  39. Thus, a major theme of the still developing FinTech story
  40. is about the increasing influence and applicability
  41. of first, machine learning, and more broadly,
  42. artificial intelligence.
  43. And this is what we want to explore in this module.
  44. So to help us get started, let’s consider a few terms,
  45. some buzzwords,
  46. so that we have the right vocabulary for our discussion.
  47. Now keep in mind, the definitions of many
  48. of these terms are not uniformly consistent yet,
  49. and even experts themselves may have slightly different
  50. approaches or views,
  51. but we went with a few definitions that we think are
  52. not just comprehensive but also comprehensible
  53. even if you’re not a technology expert.
  54. So what is artificial intelligence or AI?
  55. AI is really an umbrella term that encompasses
  56. a number of technologies,
  57. but before jumping into that,
  58. let’s start with some history.
  59. Alan Turing, the pioneering English computer scientist
  60. and mathematician,
  61. and at least one of the grandfathers of AI,
  62. first started considering AI concepts even before 1950.
  63. His eponymous Turing Test,
  64. which moved beyond the question of
  65. “Can machines think?”
  66. to the more nuanced question of
  67. “Can a machine imitate a human?”
  68. is interesting.
  69. And basically, if a computer and a person were
  70. answering questions that you asked,
  71. but you didn’t know which answers were given by
  72. the human or the computer,
  73. would you be able to identify the computer
  74. from its answers alone,
  75. or could the computer trick you into
  76. thinking it was a person?
  77. And John McCarthy, long-time Stanford professor
  78. and one of the fathers of AI,
  79. who is widely credited with coining the term
  80. “artificial intelligence” expanded further.
  81. To “Uncle John” as he was referred to
  82. among many of his students,
  83. AI is the “science and engineering
  84. of making intelligent machines.”
  85. But what then is intelligence?
  86. Stephen Hawking is widely credited with saying,
  87. “Intelligence is the ability to adapt to change.”
  88. And so the increasing capacity of machines to learn
  89. and react as new data is presented
  90. represents this process of adapting
  91. that is at the core of Hawking’s view of intelligence.
  92. Increases in computing power coupled with the
  93. creation, collection, and analysis of
  94. an ever-growing amount of data will continue to
  95. enhance the capability of artificial intelligence.

Additional Readings

  • West, D. M. (2018). What is Artificial Intelligence? Brookings Institution. Retrieved from https://www.brookings.edu/research/what-is-artificial-intelligence/
  • Turing, A. M. (1950). Computing Machinery and Intelligence. Mind, 59(236), 433–460. Retrieved from https://doi.org/10.1093/mind/LIX.236.433
  • Sharkey, N. (2012). Alan Turing: The Experiment that Shaped Artificial Intelligence. BBC News. Retrieved from https://www.bbc.com/news/technology-18475646 
  • Torres, B. G. (2016). The True Father of Artificial Intelligence. Open Mind. Retrieved from https://www.bbvaopenmind.com/en/technology/artifficial-intelligence/the-true-father-of-artificial-intelligence/ 
  • Cameron, E., & Unger, D. (2018). Understanding the Potential of Artificial Intelligence. Strategy+Business. Retrieved from https://www.strategy-business.com/article/Understanding-the-Potential-of-Artificial-Intelligence?gko=c3fb6
  • Brundage, M., et al. (2018). The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation. Future of Humanity Institute. Retrieved from https://arxiv.org/ftp/arxiv/papers/1802/1802.07228.pdf   

4.2.2 What Is Machine Learning?

  1. Now that we’ve touched on AI,
  2. let’s move on to machine learning.
  3. Machine learning really is a subset of AI,
  4. and often when people refer to AI,
  5. they are usually talking about machine learning,
  6. especially when it comes to FinTech.
  7. For example, advances in algorithmic trading
  8. are being powered by machine learning.
  9. Additionally, the abilities of financial institutions
  10. to manage risk, detect fraud,
  11. and even optimize operational processes
  12. are all being made more efficient and accurate
  13. through machine learning.
  14. Even lawyers like us, correction—former lawyers,
  15. who maybe thought we were immune
  16. from technological change are being impacted as
  17. machine learning technology is already being
  18. implemented to review documents,
  19. like contracts or loan agreements,
  20. much faster, cheaper, and even more accurately
  21. than a human could.
  22. Sounds exciting right?
  23. So let’s jump into it. What is machine learning?
  24. Now machine learning is effectively a machine,
  25. say a computer, combing through and analyzing
  26. large amounts of data with statistics to find patterns.
  27. Now that data could be in the form of text,
  28. like in a loan document,
  29. or it could be a series of numbers, like stock prices,
  30. or a whole host of other types of information.
  31. Now based on that data, the machine
  32. can start making predictions, and
  33. as more data comes in,
  34. the predictions become more refined.
  35. Now most of us interact with machine learning
  36. almost on a daily basis
  37. — basically whenever we enjoy
  38. any kind of service that recommends things to us,
  39. you know, like the new show that Netflix
  40. is going to recommend to you tonight.
  41. Lastly, machine learning can be further categorised
  42. as supervised learning,
  43. where the data is labelled or identified;
  44. unsupervised learning,
  45. where there are no such identifying markers;
  46. or reinforcement learning,
  47. which is what Google’s AlphaGo represents,
  48. and is based on the machine figuring things out
  49. after exploring multiple permutations of outcomes
  50. —so basically there’s like a massive
  51. iteration process of trial and error.
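
To make the supervised/unsupervised distinction above concrete, here is a minimal sketch of supervised machine learning in Python using scikit-learn. The loan-style data, the feature choices, and the model are our own illustrative assumptions, not something from the course:

    # A minimal, hypothetical sketch of supervised machine learning (scikit-learn).
    # The data and the model choice are illustrative assumptions, not from the course.
    from sklearn.linear_model import LogisticRegression

    # Labelled training data: each row is [loan_amount, borrower_income];
    # each label marks whether the loan defaulted (1) or was repaid (0).
    X = [[5000, 30000], [20000, 25000], [3000, 60000],
         [15000, 40000], [25000, 20000], [4000, 80000]]
    y = [0, 1, 0, 0, 1, 0]

    model = LogisticRegression(max_iter=1000)
    model.fit(X, y)  # "learning": finding a statistical pattern in the labelled data

    # Prediction on new, unseen data; more training data would further refine this.
    print(model.predict([[10000, 50000]]))  # e.g. [0], i.e. predicted to repay

Unsupervised learning would drop the labels y and look for structure, such as clusters of borrowers, on its own, while reinforcement learning, as with AlphaGo, would instead learn from trial-and-error feedback.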

Additional Readings

  • Bhatia, R. (2017). What Is Machine Learning? Forbes. Retrieved from https://www.forbes.com/sites/forbestechcouncil/2017/08/07/what-is-machine-learning/#52059a5779a7

4.2.3 What Is Deep Learning?

  1. Since we’ve mentioned machine learning,
  2. it’s important to briefly touch on something called,
  3. deep learning.
  4. Now we won’t spend much time on deep learning here,
  5. but because many advances in FinTech
  6. will be built on deep learning moving forward,
  7. it’s worth explaining, even for just a few seconds.
  8. Deep learning is basically an enhanced form of
  9. machine learning that uses algorithms
  10. that emulate the neural network of the brain
  11. —basically how our brains learn—
  12. to help the algorithm learn through
  13. a progression of layers that get “deeper” and “deeper”
  14. as more data is incorporated.
  15. So like machine learning,
  16. deep learning can be supervised, unsupervised,
  17. or reinforcement-based.
  18. If you weren’t familiar with those terms before,
  19. hopefully they make a little more sense now,
  20. and hopefully you also better understand
  21. the relationship amongst AI, Machine Learning,
  22. and Deep Learning.
  23. And we also hope that you’ve noticed that
  24. these forms of AI all rely on massive amounts of data,
  25. which is why our discussion of data at
  26. the beginning of this course is so important.
  27. Data truly is the fuel
  28. that will power AI-backed FinTech innovation.
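
For a concrete feel of those “deeper” layers, here is another minimal sketch in Python with scikit-learn; the layer sizes and the toy data are purely our own illustrative assumptions:

    # A minimal, hypothetical sketch of the layered idea behind deep learning.
    # Layer sizes and the toy data are illustrative assumptions only.
    from sklearn.neural_network import MLPClassifier

    X = [[0, 0], [0, 1], [1, 0], [1, 1]]  # toy inputs
    y = [0, 1, 1, 0]                      # XOR: a pattern no single linear rule captures

    # Two hidden layers of artificial "neurons"; stacking more layers
    # is what makes a network "deep".
    net = MLPClassifier(hidden_layer_sizes=(8, 8), solver="lbfgs",
                        max_iter=5000, random_state=1)
    net.fit(X, y)
    print(net.predict([[1, 0], [1, 1]]))  # expected: [1 0]

Swapping in more layers and far more data is, roughly speaking, the step from this toy network to the kind of deep learning behind modern facial recognition.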

Additional Readings

  • Marr, B. (2018). What Is Deep Learning AI? A Simple Guide With 8 Practical Examples. Forbes. Retrieved from https://www.forbes.com/sites/bernardmarr/2018/10/01/what-is-deep-learning-ai-a-simple-guide-with-8-practical-examples/#5b89f0508d4b 

4.3.1 AI and the Trolley Problem

  1. We’ve just discussed AI and data.
  2. Now, let’s think about that in the real world.
  3. By returning to something we discussed in module 1,
  4. where we introduced the trolley problem.
  5. If you recall, when we discussed the trolley problem
  6. we talked about two scenarios where
  7. a runaway trolley was about to hit a group of five people.
  8. In one of the scenarios you had the choice to divert
  9. the trolley with a switch
  10. which would change the trolley’s direction
  11. and hit only one person, who would be killed by the impact.
  12. In the second scenario, instead of a switch however,
  13. you would have to push a person in front of the trolley
  14. to stop it – thus saving the group of five
  15. – but killing the person you pushed.
  16. Nearly everybody chooses to divert the trolley
  17. with the switch,
  18. and nearly all object to pushing a person into its path.
  19. Now this dichotomy highlights the important aspect of
  20. proximity in people’s decision-making,
  21. such as how proximate or close we are
  22. to a given context, or how personal it feels
  23. can alter our decisions completely.
  24. In recent years, the trolley problem has morphed into
  25. other dilemmas that have become popular
  26. in the news and in the media.
  27. This is especially true for AI and self-driving cars.
  28. With autonomous vehicles on the horizon,
  29. self-driving cars have to handle choices about accidents
  30. – like causing a small accident to prevent a larger one.
  31. So, this time, for our hypothetical scenario,
  32. instead of a runaway trolley, think of a self-driving car,
  33. and instead of a switch to redirect the car,
  34. the “switch” is the self-driving car’s “programming”.
  35. So for example, imagine a self-driving car
  36. driving at high speeds, with two passengers,
  37. where suddenly three pedestrians enter into the
  38. crosswalk in front of the car.
  39. The car has no chance of stopping.
  40. Should the car hit the three pedestrians,
  41. who will likely get killed?
  42. Or crash into a concrete barrier
  43. which would lead to the two passengers likely dying?
  44. Now imagine you are the passenger of the car,
  45. what would your answer be then?
  46. And what car would you ultimately buy?
  47. A car that saves you, the passenger,
  48. at all cost in any scenario,
  49. or one that minimizes harm to all
  50. – but which ultimately may affect you?
  51. If there was no self-driving vehicle,
  52. and you were the driver,
  53. whatever happened would be understood as
  54. maybe a reaction, a panicked decision
  55. and definitely not something deliberate.
  56. However, in the case of a self-driving vehicle,
  57. if a programmer has developed software,
  58. so the vehicle will make a certain type of decision
  59. depending on the context,
  60. then in an accident where people are harmed,
  61. is the programmer responsible for that?
  62. Is the car manufacturer responsible for that?
  63. Or, who is responsible?
  64. Is there even an answer to what a self-driving car
  65. should do?
  66. Now researchers at MIT,
  67. the Massachusetts Institute of Technology,
  68. further revived this moral quandary back in 2016.
  69. They created a website they called the Moral Machine,
  70. and through that website, respondents around the world
  71. were asked to decide in various self-driving vehicle scenarios,
  72. such as whether to kill an old man or an old woman,
  73. an old woman or a young girl,
  74. the car passenger or pedestrians,
  75. and many other similar questions.
  76. Since its launch the experiment has generated
  77. millions of decisions,
  78. and analysis of the data was presented in a paper
  79. in the scientific journal Nature in 2018.
  80. The study sparked a lot of debate about
  81. ethics in technology,
  82. which is the purpose of this course.
  83. So given that, we’d like to ask you a few questions.
  84. One, who should you trust?
  85. Should we trust AI?
  86. Or should we trust humans?
  87. Two, who’s responsible if something bad happens?
  88. So in the context of an autonomous driven vehicle,
  89. is the car manufacturer responsible?
  90. Is the software programmer responsible?
  91. Or another stakeholder?
  92. And third, culture.
  93. What is the role of culture in all of this?
  94. Let’s consider these questions together.
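
To make concrete what it means for the “switch” to be the car’s programming, here is a deliberately oversimplified sketch in Python. The policy, the function name, and the numbers are entirely hypothetical illustrations of ours, not any real vehicle’s logic:

    # A hypothetical, oversimplified "minimise total harm" policy.
    # Nothing here reflects any real vehicle's software.
    def choose_action(pedestrians_at_risk: int, passengers_at_risk: int) -> str:
        if pedestrians_at_risk > passengers_at_risk:
            return "swerve_into_barrier"  # sacrifices the passengers
        return "stay_on_course"           # sacrifices the pedestrians

    # Unlike a panicked human reaction, this rule was written down in advance.
    print(choose_action(pedestrians_at_risk=3, passengers_at_risk=2))

The point is not the code itself, but that someone has to choose and encode a rule before any accident occurs, which is exactly why questions of responsibility reach back to the programmer and the manufacturer.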

Notes on AI and the Trolley Problem:

It is important to note that the trolley problem is fundamentally about showing how we process information and highlighting blind spots in our decision-making. Doing so hopefully helps us improve our choices by demonstrating the need for morality and a sense of responsibility to humanity in our decision-making. And to the extent we think that morality, emotion, and humanity are important and worth developing, you could say that by linking AI and driverless cars to the trolley problem, we may be doing the opposite of what was intended and missing the point altogether, possibly to our mutual disadvantage. We should be wary that we are making the whole conversation less proximate.

Additional Readings

  • Awad, E., Dsouza, S., Kim, R., Schulz, J., Henrich, J., Shariff, A., Bonnefon, J. F., & Rahwan, I. (2018). The Moral Machine Experiment. Nature, 563, 59–64. Retrieved from https://www.nature.com/articles/s41586-018-0637-6 (paywall)
  • Huang, E. (2018). The East and West Have Very Different Ideas On Who To Save In A Self-Driving Car Accident. Quartz. Retrieved from https://qz.com/1447109/how-east-and-west-differ-on-whom-a-self-driving-car-should-save/ 
  • Hao, K. (2019). Giving Algorithms a Sense of Uncertainty Could Make Them More Ethical. MIT Technology Review. Retrieved from https://www.technologyreview.com/s/612764/giving-algorithms-a-sense-of-uncertainty-could-make-them-more-ethical/ 

4.3.2 AI and the Trolley Problem: Trust and Proximity

  1. So, let’s talk about trust.
  2. Now, Dave, you’ve been a passenger
  3. in a car that I’ve driven before.
  4. – I have.
  5. – So, who do you trust more,
  6. me or the autonomous-driven vehicle?
  7. Well,
  8. as much as–
  9. – Tough question, I know.
  10. – No, so as good of a driver as you are,
  11. the reality is, well I don’t know.
  12. I guess this is the thing that is
  13. so disconcerting for a lot of people.
  14. I think when I’m in your car, I trust you.
  15. If I was the one that was driving,
  16. I would certainly trust myself, right?
  17. But I think for a lot of people, we just have a question
  18. of this completely autonomous, non-human actor,
  19. and not singular, either, like potentially thousands
  20. of these non-human actors that are gonna be out there
  21. with these large vehicles roaming around.
  22. The reality is, I think, that I would probably want
  23. to trust you more because you’re my friend,
  24. and I know you,
  25. but I think empirically, I believe that it is probably
  26. a lot safer with a host of autonomous vehicles
  27. that are out there.
  28. – Yeah, I think you’re right.
  29. I think a lot of the research as we have it now
  30. demonstrates and leads to the fact that overall,
  31. things will probably be safer
  32. as more and more autonomous vehicles
  33. are on the road.
  34. But why do so many people,
  35. or why do you think so many people are
  36. so resistant to that?
  37. – Well, I mean, I think I’m going to flip it
  38. around and ask you, right,
  39. ’cause this trust is a big part of what we’re talking about,
  40. and the fundamental, I guess, foundation for so much
  41. of this ethics conversation.
  42. And, do we over-trust ourselves?
  43. Do we under-trust technology,
  44. or is it the other way around?
  45. Like, are we rushing so quickly into these technologies
  46. without really understanding whether or not
  47. we should be trusting them.
  48. – Yes, so I think two things off the top of my head.
  49. One is, you know, as humans,
  50. we tend not to trust things that we don’t understand.
  51. – Right.
  52. – Right.
  53. And so, I think that plays a lot into it of
  54. hey, I don’t understand how this exactly works,
  55. so I’m gonna distance myself from this,
  56. or I’m gonna be suspicious of it
  57. until I do understand how it works.
  58. I think there’s a lot of that.
  59. Two, I think this idea of over-confidence, right.
  60. As again, as humans, we tend to be more over-confident
  61. of our own ability than we probably should be.
  62. There’s been tons of tests that have been done
  63. where that’s been shown, right?
  64. And I think the combination of those two things
  65. of hey, I’m actually not that bad of a driver anyway,
  66. so it should be okay, plus I don’t understand
  67. what’s going on in this car with no driver.
  68. Those two things kind of collide, I think,
  69. amongst humanity or mankind,
  70. to kind of create this situation where hey,
  71. maybe I’m resistant to this change.
  72. – Yeah, so from the perspective of trust,
  73. I think you kind of hit on what we discussed earlier
  74. from a cultural-lag perspective, right?
  75. For a lot of people, they’re gonna be very comfortable
  76. kind of continuing in that perceived
  77. safe method of travel,
  78. when in actual fact,
  79. the numbers maybe don’t bear that out,
  80. and they’d be willing to persist with a situation
  81. where they’re the driver rather than going
  82. into a potentially safer autonomous vehicle.
  83. And I think this gets to another interesting point
  84. that is often a criticism of the trolley problem
  85. in the first place is that it presents
  86. this binary, almost illogical situation
  87. where you have to choose between one person dying
  88. or five people dying or some really fantastical situation
  89. when that’s probably not the case at all.
  90. – Hey, it’s not reflective of real life.
  91. – Sure, they’re not reflective of real life,
  92. and so, I guess my question for you,
  93. from an autonomous-driving-vehicle perspective
  94. and AI perspective, I think one of the real reasons
  95. why people in this industry are saying you should trust
  96. autonomous vehicles more is because
  97. they can communicate
  98. with each other kind of seamlessly and simultaneously.
  99. What’s your perspective on that?
  100. I mean is that kind of how it would work,
  101. and how would that potentially
  102. make things better, safer, smoother?
  103. – So I think that’s a really cool question
  104. for at least, again, two reasons off the top of my head.
  105. I think historically, when you look at the
  106. earliest versions of this kind of autonomous driving,
  107. there’s an idea that actually these vehicles
  108. would not be independent.
  109. They would somehow be in sync with each other
  110. to make driving much more efficient,
  111. so I think the more advanced forms
  112. of this autonomous driving will be
  113. exactly what you are talking about,
  114. this kind of linked network of vehicles
  115. that collectively will be able to gauge risk,
  116. and overall, holistically, maybe make things safer.
  117. So I think there’s definitely that component that exists.
  118. I think the second thing ties
  119. into what you're talking about
  120. with respect to this binary,
  121. this kind of false dichotomy, right?
  122. – Yep.
  123. – It’s like a binary code.
  124. It’s either zero or there’s a one.
  125. – Yeah, there’s lots of information, it’s not just either or.
  126. – Exactly, and if you talk to people
  127. who operate in this space, either an auto manufacturer
  128. who’s trying to go into autonomous vehicles,
  129. or on the software side,
  130. people who are developing the software,
  131. you know, almost uniformly, they will tell you
  132. that it’s never binary.
  133. – Yeah.
  134. – It’s always multiple, different outcomes
  135. and things that can happen,
  136. and you know, kind of based on what we discussed
  137. just a few minutes before about AI
  138. and machine learning and deep learning,
  139. and this idea that these systems will go
  140. through multiple permutations,
  141. based on the data that's being inputted,
  142. and historical data as well as data
  143. that’s coming in live.
  144. Then, they look at what the different outcomes will be,
  145. and so what that tells us is
  146. that there will probably be multiple outcomes anyway,
  147. which is more reflective of reality,
  148. which gets to the criticism
  149. that a lot of people have about trolley.
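
The "never binary" point above can be made concrete. Below is a minimal, purely illustrative sketch of expected-harm minimization: each candidate maneuver has several possible outcomes, and the system picks the maneuver with the lowest probability-weighted harm. This is not any manufacturer's actual planning code; every maneuver name, probability, and harm value is hypothetical.

```python
# A minimal sketch of expected-harm minimization over several candidate
# maneuvers. All maneuvers, probabilities, and harm values are hypothetical.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    outcomes: list  # (probability, estimated_harm) pairs

def expected_harm(m: Maneuver) -> float:
    """Probability-weighted harm across every predicted outcome."""
    return sum(p * harm for p, harm in m.outcomes)

candidates = [
    Maneuver("brake_hard",  [(0.90, 0.0), (0.10, 3.0)]),
    Maneuver("swerve_left", [(0.70, 0.0), (0.25, 1.0), (0.05, 8.0)]),
    Maneuver("maintain",    [(0.50, 0.0), (0.50, 6.0)]),
]

best = min(candidates, key=expected_harm)
print(best.name, round(expected_harm(best), 2))  # -> brake_hard 0.3
```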

Additional Readings

  • Sage, A., Bellon, T., & Carey, N. (2018). Self-driving car industry confronts trust issues after Uber crash. Reuters. Retrieved from https://www.reuters.com/article/us-autos-selfdriving-uber-trust/self-driving-car-industry-confronts-trust-issues-after-uber-crash-idUSKBN1GY15F 
  • Kaur, K., & Rampersad, G. (2018). Trust in Driverless Cars: Investigating Key Factors Influencing the Adoption of Driverless Cars. Journal of Engineering and Technology Management, 48, 87-96. Retrieved from https://doi.org/10.1016/j.jengtecman.2018.04.006 
  • Verger, R. (2019). What will it take for humans to trust self-driving cars? Popular Science. Retrieved from https://www.popsci.com/humans-trust-self-driving-cars 
  • Baram, M. (2018). Why the Trolley Dilemma for Safe Self-Driving Cars is Flawed. FastCompany. Retrieved from https://www.fastcompany.com/90308968/why-the-trolley-dilemma-is-a-terrible-model-for-trying-to-make-self-driving-cars-safer 

4.3.3 AI and the Trolley Problem: Cultural Lag

  1. I’m curious to hear what you think
  2. in terms of, again, thinking of cultural lag
  3. and thinking of how we could implement these things.
  4. There are certain jurisdictions that are way out front
  5. in terms of trying to establish the physical landscape
  6. that would allow these systems to take place.
  7. So, one of the more famous ones
  8. would be certain parts of Arizona for example
  9. in the United States, right?
  10. And they’re trying to make sure
  11. the physical infrastructure is there
  12. to kind of speed through that cultural lag
  13. but it is also interesting, the juxtaposition
  14. that anytime an autonomous vehicle
  15. hits someone,
  16. or hits anything, it's global news, right?
  17. So what is that juxtaposition and why is it news
  18. just because there’s an accident?
  19. – One, I think the nature of media now
  20. is that people want to see headlines, right?
  21. And so I think there’s something exotic still about,
  22. “Hey, AI, even though or autonomous driven vehicle,
  23. even though statistically you’re probably still safer,
  24. even now, than the average driver.”
  25. – Yeah, ’cause they’ll be quick to point out,
  26. these cars have been driving around for
  27. thousands of hours–
  28. – Miles, right.
  29. – Yeah, thousands of miles.
  30. – Kilometres, yeah.
  31. – Yeah, exactly, yeah, yeah.
  32. – And we don’t report on every accident that happens.
  33. – Right, or non-accident.
  34. – Or non-accident that happens.
  35. Yet if a vehicle that's been driven so long
  36. has one accident,
  37. just because it’s autonomously driven,
  38. now it’s an issue.
  39. So, I think there’s a bit of a media frenzy
  40. around autonomous driven vehicles,
  41. partly because it’s a bit sexy right now
  42. and partly because people are not sure
  43. what it is and what’s gonna happen.
  44. I do think the Arizona example is interesting
  45. because definitely there are pockets
  46. of geographies in different places in the world
  47. so in the United States, you mentioned Arizona
  48. but if you go to Silicon Valley now,
  49. you see Google Waymo vans everywhere, right?
  50. – Yeah, on the Google campus, yeah, yeah, yeah.
  51. – And so, there’s a bit of that.
  52. I think outside of the United States,
  53. one place that’s really interesting
  54. or been at the forefront of this is Japan,
  55. which has instituted, at the national level,
  56. a series of legislation to allow
  57. autonomous driven vehicles
  58. and even trucks in the next few years.
  59. – Yeah.
  60. And so they’re quickly trying to build up
  61. the technological infrastructure as well
  62. as the physical infrastructure to allow
  63. these kinds of vehicles to operate more effectively
  64. and efficiently.
  65. – Yeah.
  66. – And so I think that’s an important piece
  67. and I think once you have kind of national and local
  68. leaders behind it then the regulatory landscape
  69. will change pretty rapidly around that
  70. and once that happens, insurance will change,
  71. ideas of liability will change,
  72. and those kinds of processes will start developing.
  73. – So let’s talk about that for a minute.
  74. So again, for all of you out there.
  75. Just think for a moment.
  76. Let’s say that, talking about, again,
  77. the changing technology and then how culture and
  78. therefore laws and things have to change and
  79. catch up to it.
  80. We’re talking about regulation.
  81. Who would be responsible if there was an accident?
  82. So think about that for a moment.
  83. Let’s say you’re walking down the street.
  84. You’re crossing the street and all the sudden,
  85. an autonomous Uber or delivery vehicle.
  86. Maybe a Google bus or something
  87. cuts you off and ends up knocking you down
  88. causing some injury.
  89. Think about it for a minute.
  90. Like who would or should be responsible for that?
  91. – Well, maybe as a starting point,
  92. think about it now as if it was just a
  93. normally driven vehicle, without the
  94. autonomous software driving the vehicle.
  95. Who would be responsible?
  96. And we would go through a very typical
  97. kind of legal analysis.
  98. Insurance people would be involved.
  99. A police officer would show up and do a police report,
  100. which would probably attribute some negligence
  101. to the driver, or maybe to you if you were jaywalking.
  102. – Yeah, different people.
  103. – And there would be different people.
  104. I think the initial starting point would be
  105. similar, just because the vehicle involved
  106. in the accident has no driver–
  107. – Yeah.
  108. It doesn’t change the entire dynamic.
  109. – It doesn’t change that entire dynamic.
  110. Exactly.
  111. Now, I guess that goes to a more fundamental question
  112. though about, let's say there's something inherently wrong
  113. with the software or with the vehicle itself,
  114. the autonomous driven vehicle.
  115. Then who would be responsible?
  116. Would it be the software programmer?
  117. The developer who created the AI software
  118. or his company?
  119. Or would it be the car manufacturer
  120. that actually owns or manufactured the vehicle?
  121. Or would it be the owner of the vehicle
  122. who’s not even driving.
  123. – Yeah.
  124. – But they actually own maybe a fleet of these vehicles.
  125. – It does make it difficult though.
  126. I mean although it doesn’t change things entirely,
  127. there is one big missing component there.
  128. It’s the driver.
  129. Right, so currently, under tort law,
  130. almost everywhere in the world,
  131. if a car strikes someone then the driver
  132. is almost universally gonna be responsible
  133. so it certainly does limit the number of people
  134. that could potentially be responsible.
  135. – Yeah, so I think you’re right.
  136. Overwhelmingly, if a driver has an accident
  137. with a pedestrian, in most situations,
  138. the driver is gonna be held responsible for that.
  139. I think the proxy for that moving forward,
  140. with autonomous vehicles, would be who owns the vehicle.
  141. Now, the thing that would be really interesting I think
  142. is the next iteration.
  143. Now, there’s a next, the next version as it advances,
  144. the idea of owning a vehicle
  145. is vastly different from before.
  146. – Yeah, yeah.
  147. – It may be owned collectively by a neighbourhood or by–
  148. – Or it could be a utility like electricity.
  149. – Exactly, or it could be a utility, or owned by a company
  150. that has a fleet of taxis,
  151. but similarly, there will just be a few of them.
  152. And so depending on how these assets then are owned,
  153. the idea of ownership will also become very interesting
  154. and how you hold those people accountable.
  155. – That is why I hold a little bit of concern
  156. in this regard, because typically, the bigger
  157. the actor is, the more challenging it is
  158. for an individual, someone who's injured,
  159. to seek redress and to recover
  160. any type of damages from that.
  161. So for example, if it’s you versus Uber,
  162. that’s a significantly–
  163. – Power dynamic is very skewed.
  164. – Very different power dynamic
  165. than if it was me versus you, let’s say, right?

Additional Readings

  • Ryall, J. (2019). Japan edges closer towards brave new world of self-driving cars but hard questions remain. South China Morning Post. Retrieved from https://www.scmp.com/news/asia/east-asia/article/2180828/japan-edges-closer-towards-brave-new-world-self-driving-cars 
  • Bogost, I. (2018). Who Is Liable for a Death Caused by a Self-Driving Car? The Atlantic. Retrieved from https://www.theatlantic.com/technology/archive/2018/03/can-you-sue-a-robocar/556007/ 

4.3.4 AI and the Trolley Problem: Cultural Differences and Biases

  1. Okay, so we’ve covered trust and responsibility
  2. and the challenge of getting these things going
  3. from a cultural-lag perspective.
  4. But I thought one of the most interesting things
  5. to come out of this,
  6. especially from the MIT study in particular,
  7. was the way various elements
  8. of culture and perhaps bias, kinda came out
  9. and the potential programming implications
  10. from an AI perspective.
  11. Can you talk about that for a minute?
  12. – Yeah, so I think that's what got picked up
  13. by the media the most.
  14. – Everybody was talking about
  15. – Everybody was talking about the cultural implications
  16. of what this Moral Machine produced:
  17. the data that came out of these, basically,
  18. surveys that people were doing.
  19. Effectively how different cultures,
  20. or at least the way it was painted,
  21. was how different cultures
  22. prioritise life in a sense.
  23. If that life was in a car with you,
  24. it could be more valuable in a certain
  25. cultural context than the life
  26. that’s outside the car, that potentially you’re hitting.
  27. And so how do you try to protect one life over the other?
  28. And so, you know, that’s a simplistic explanation
  29. but you know, there was a lot of very
  30. interesting takeaways that people had.
  31. They found that Chinese respondents
  32. were more likely to choose hitting pedestrians
  33. on the street,
  34. instead of putting the car's passengers in danger,
  35. and were more likely to spare the old over the young.
  36. Western countries or people from western countries
  37. tended to prefer inaction,
  38. letting the car continue its path.
  39. So, kind of like inertia.
  40. While Latin Americans preferred to save the young.
  41. – Okay, so I wasn’t that surprised
  42. when I saw the results from the MIT study
  43. and it showed that Asians, for example,
  44. were more likely to preserve the life of the elderly
  45. at the expense of the young.
  46. I think, having been in Asia
  47. for the past 20+ years,
  48. many cultures here have a reverence for the elderly.
  49. – Deference
  50. – Deference at least, yeah.
  51. And so, I think there were certain things
  52. like that, that maybe weren't that surprising,
  53. and fit certain cultural stereotypes
  54. that have, I think been around for a long time.
  55. But I guess that the bigger question is
  56. not that these cultural preferences existed,
  57. but what we should do with them as a result,
  58. especially when programming FinTech
  59. and things in the future, right.
  60. I think you and I both know this
  61. from one of the challenges that
  62. lawmakers, or ethicists like us,
  63. or companies face as they're trying to create
  64. a moral code for their employees.
  65. One of the challenges that they have
  66. is trying to create a moral code
  67. that permeates culture and
  68. goes across country lines, right.
  69. So, is it possible, or should it even be a goal
  70. from an AI and technology standpoint
  71. for us to create a uniform sense of morality?
  72. – Yeah, that’s a really important question to be honest.
  73. So I think if we take one step
  74. before we get even to the technology.
  75. I think the example you gave of, let’s say
  76. you have a Western company that does business
  77. all over the world
  78. – Yeah.
  79. Asia, Africa, the Middle East–
  80. – They write a code of conduct in California.
  81. – Yeah–
  82. – So then they have to apply it everywhere.
  83. – And now they want to make it universal.
  84. – Yeah, yeah.
  85. – But potentially what is right
  86. in their initial cultural context
  87. may actually be questionable
  88. in a different cultural context.
  89. – Or, still perhaps “right”
  90. from a legal or moral perspective,
  91. but communicated in a way that doesn’t
  92. resonate with local people.
  93. – Sure.
  94. And so, there are a lot of implementation challenges
  95. to say the least,
  96. when companies try to embark on this kind of initiative.
  97. So, if we transport that into AI
  98. and just technology in general
  99. you know, what comes to mind here is,
  100. automobiles or the automobile industry
  101. is a global industry, right.
  102. We have car manufacturers in China,
  103. in Japan, in Korea, in the United States,
  104. and a whole host of other places.
  105. And so, imagine the programmer who sits somewhere in Asia
  106. with a certain cultural context,
  107. programming a particular type of AI
  108. into a vehicle,
  109. perhaps with some of the results from the MIT study.
  110. And let's say that vehicle is then imported
  111. or shipped to the United States,
  112. and that particular cultural context
  113. bleeds into how that vehicle operates in
  114. a different cultural context.
  115. – Right.
  116. – And then what influence does that vehicle,
  117. with its particular cultural programming,
  118. have on the road with other vehicles
  119. that have a different cultural influence?
  120. How do those all interact?
  121. I think that’s a really fascinating
  122. and important question.
  123. It’s a microcosm of a greater host of challenges
  124. that AI will bring to the forefront,
  125. the type of things that we need to discuss as a society.
  126. – Yeah, and so, again, foreshadowing a little bit,
  127. but also, revisiting the very first module.
  128. Really, while these are interesting practical challenges
  129. that all of us have to consider
  130. as we enter into this kind of new wave
  131. of the Fourth Industrial Revolution,
  132. we haven't even touched upon
  133. the most critical of these issues:
  134. the idea that one of the most common forms of work
  135. globally is driving.
  136. – Drivers.
  137. Certainly within the US, within China,
  138. and other places, et cetera.
  139. There are so many millions, and millions,
  140. and millions of drivers around the world,
  141. and so this kinda leads into more fundamental,
  142. systemic, social questions of
  143. if we remove these drivers from the equation,
  144. how do we then reintegrate them
  145. into the workforce?
  146. How do we ensure that society
  147. is able to absorb those people,
  148. provide them not only jobs,
  149. but a sense of well-being?
  150. And that’s something that we’re gonna be considering
  151. in the next few modules.

Additional Readings

  • Hao, K. (2018). Should a self-driving car kill the baby or the grandma? Depends on where you’re from. MIT Technology Review. Retrieved from https://www.technologyreview.com/s/612341/a-global-ethics-study-aims-to-help-ai-solve-the-self-driving-trolley-problem/ 
  • Maxmen, A. (2018). Self-Driving Car Dilemmas Reveal That Moral Choices Are Not Universal. Nature. Retrieved from https://www.nature.com/articles/d41586-018-07135-0

4.4.1 Data and Models

  1. Since data is so critical to AI
  2. as well as to many of the other technologies that
  3. underpin FinTech,
  4. it is important not only that the right data is being used
  5. but also that such data is not biased.
  6. The phrase “garbage in, garbage out”
  7. has probably never been more apt, nor as important,
  8. than when describing AI.
  9. And bias can find its way into AI in a few ways.
  10. Let’s take a simple example.
  11. If a computer’s model is using data that is
  12. already contaminated by some level of discrimination
  13. then the output will also inevitably be prejudiced.
  14. So say for instance,
  15. your AI relies on data from apartheid-era South Africa,
  16. well, chances are that data incorporates the widespread
  17. racist policies that existed at that time.
  18. Obviously, this would lead to less than ideal outcomes.
  19. And even assuming your data is free of
  20. such explicit bias,
  21. there are other ways for bias to possibly creep in
  22. to artificial intelligence.
  23. For example, cultural bias and norms
  24. can inadvertently be programmed into AI
  25. because a programmer from one culture
  26. might value some characteristic differently
  27. than a programmer from another part of the world.
  28. We’ll explore this a bit further when we revisit
  29. the trolley problem.
  30. There are other potential issues that
  31. also relate to bias.
  32. AI is driven by algorithms and models.
  33. In her thought-provoking book,
  34. Weapons of Math Destruction,
  35. Harvard-trained mathematician
  36. Cathy O’Neil describes what she refers to as “WMDs”:
  37. she identifies three characteristics
  38. of a possibly dangerous model.
  39. So the first characteristic of a dangerous model
  40. is that the model is opaque and not transparent.
  41. So this would be if the system is what we call
  42. a black box,
  43. and it’s difficult for those from the outside
  44. to be able to really understand
  45. what is going on behind the scenes.
  46. The second is that,
  47. the model is potentially scalable
  48. and can be used broadly or across large populations.
  49. Now of course, this has been a key component
  50. to what we’ve talked about thus far.
  51. The issue with a lot of these AI
  52. and other forms of technology
  53. is that they can scale beyond
  54. anything that we’ve seen before.
  55. And the third aspect is that,
  56. the model would potentially be unfair
  57. in a way that would negatively impact
  58. or even destroy people’s lives.
  59. So for example,
  60. if AI was being used from a FinTech context,
  61. to determine who could get a mortgage to
  62. purchase a home, who has access to credit, etc.
  63. These would be things that could have a significant
  64. negative impact if someone was not granted
  65. access to them.
  66. So despite all the good that will certainly accompany
  67. the rise of AI,
  68. it’s also pretty clear that
  69. biased data in conjunction
  70. with possibly suspect models
  71. have the potential to create more risk,
  72. unfairness, and inequality,
  73. which is why it’s important
  74. to be aware of their impact
  75. and invest time thinking about how to prevent
  76. such problems, now,
  77. before the technology is fully mature
  78. and really permeates our lives.
  79. So in the next few cases,
  80. we’re going to look at some of these warning signs
  81. in real life scenarios.
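
To make “garbage in, garbage out” concrete, here is a deliberately toy sketch. It is not a real lending system: the records are invented, and the “model” simply memorizes historical approval rates per group, showing how discrimination baked into the training data is reproduced as policy.

```python
# A toy "garbage in, garbage out" demonstration with hypothetical records:
# a naive model that only learns historical approval rates per group will
# reproduce whatever discrimination those records contain.
from collections import defaultdict

# Hypothetical history: group "A" approved 80% of the time, group "B" 40%.
history = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 40 + [("B", 0)] * 60

def train(records):
    approved, total = defaultdict(int), defaultdict(int)
    for group, outcome in records:
        approved[group] += outcome
        total[group] += 1
    # "Model": approve a group if its historical approval rate exceeds 50%.
    return {g: approved[g] / total[g] > 0.5 for g in total}

model = train(history)
print(model)  # {'A': True, 'B': False} -- yesterday's bias becomes policy
```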

Additional Readings

  • O’Neil, C. (2017). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Broadway Books.
  • Elish, M. C., & Boyd, D. (2018). Don’t Believe Every AI You See. The Ethical Machine. Retrieved from https://ai.shorensteincenter.org/ideas/2018/11/12/dont-believe-every-ai-you-see-1
  • Floridi, L. (2016). Should We Be Afraid of AI? Aeon. Retrieved from https://aeon.co/essays/true-ai-is-both-logically-possible-and-utterly-implausible 
  • Christakis, N. A. (2019). How AI Will Rewire Us. The Atlantic. Retrieved from https://www.theatlantic.com/magazine/archive/2019/04/robots-human-relationships/583204/   
  • West, D. M., & Allen, R. J. (2018). How Artificial Intelligence Is Transforming the World. Brookings Institute. Retrieved from https://www.brookings.edu/research/how-artificial-intelligence-is-transforming-the-world/ 

4.4.2 Mortgage Application

  1. In the last section we explored AI,
  2. particularly in relation to autonomous vehicles,
  3. and considered really important topics around trust,
  4. accountability and the impact of culture.
  5. Next, we will look into AI bias,
  6. specifically in the context of
  7. assisting human decision-making.
  8. Often, when we think of AI or algorithms,
  9. we think of something impartial and neutral,
  10. something that is simply acting based on pure facts.
  11. And this is one of the reasons why we humans
  12. have begun using AI to help us with
  13. more subjective evaluations and decisions.
  14. If we can remove human error from decision-making,
  15. that would lead to a more just and better world,
  16. right?
  17. But the reality tends to be that algorithms are
  18. not as neutral as many have come to hope.
  19. This is because of bias that gets programmed in,
  20. whether cultural bias from the programmer,
  21. or historical bias in the data
  22. that is somehow prejudiced in a certain way.
  23. Google AI chief, John Giannandrea,
  24. has said that his main concern regarding AI
  25. does not revolve around killer AI robots
  26. or Terminator sorts of things,
  27. but instead he is worried about the biases
  28. that he says, “that may be hidden inside algorithms
  29. used to make millions of decisions every minute”.
  30. So, first of all, what do we actually mean
  31. when we say that AI or an algorithm is biased?
  32. If you recall our talk about machine learning,
  33. a vital part of that revolves around the training of AI.
  34. Training to see and follow patterns by feeding it
  35. large amounts of information and data,
  36. training it to understand what success looks like,
  37. fine-tuning the results and reiterating,
  38. and so forth.
  39. And in this process there is the possibility of
  40. human errors and prejudice integrating itself
  41. into the algorithm.
  42. Let’s take a look at another example.
  43. In the past, if you were about to buy a home,
  44. you would typically meet in person with a
  45. mortgage officer at your local bank, probably.
  46. You would visit their workplace,
  47. have a chat, provide any relevant documentation,
  48. this person would then review your documentation
  49. and later they would give you a decision on whether the bank
  50. was going to lend you money or not.
  51. For the lending officer,
  52. this would typically be a fairly subjective exercise.
  53. Because the majority of home loan applicants
  54. fall in some level of a grey area
  55. where there’s no definitive “yes or no” with respect
  56. to the loan, so they have some discretion.
  57. So, with the recent advent of more advanced algorithms
  58. and to increase efficiency,
  59. this process has been simplified for many banks,
  60. where the decision-making is now,
  61. to some extent, outsourced to AI,
  62. which makes the recommended loan application decision.
  63. By doing so,
  64. this process should be more accurate,
  65. objective and fair, right?
  66. Well, not always.
  67. Amongst many studies that have been done,
  68. in particular,
  69. a recent study by the University of California
  70. found strong bias and discrimination
  71. by these “AI lenders”,
  72. such as charging 11 to 17 percent higher interest rates
  73. to African American and Latino borrowers.
  74. Additionally, minority applicants
  75. are more likely to be rejected than white applicants
  76. with a similar credit profile.
  77. Now, lending discrimination is not something new,
  78. and has been reported on a lot in the past.
  79. So The Washington Post, for another US example,
  80. uncovered widespread lending discrimination
  81. back in 1993, where they showed how various
  82. home lending policies were negatively impacting
  83. minority residents.
  84. What further complicates the problem around AI bias,
  85. is what people refer to as black box algorithms.
  86. This is something similar to what we discussed earlier
  87. about opaque models, lacking transparency.
  88. And really, private companies are generally hesitant
  89. to open the door for other people
  90. to scrutinize what they’ve been doing.
  91. So how do we make an inclusive algorithm,
  92. when the data, its developers and the organizations
  93. who hire them are seemingly not diverse and inclusive?
  94. Overall, while algorithms are helpful,
  95. they may not make things as fair as we ideally
  96. would have hoped for.
  97. And we therefore have to be careful
  98. about blindly applying them
  99. – especially since they have a tendency to
  100. repeat past practices,
  101. repeat patterns,
  102. and automate the status quo.
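
One way a seemingly neutral algorithm reproduces past bias is through proxy variables. The hedged sketch below uses synthetic data: the decision rule never sees the applicant’s group, but a correlated feature (a postcode that historical data has tagged “high risk”) reintroduces the disparity. The groups, postcodes, and correlation strengths are all invented for illustration.

```python
# A hypothetical proxy-bias sketch: the model never sees `group`, yet a
# correlated feature (postcode) lets the old disparity back in. The
# correlation strengths below are invented purely for illustration.
import random

random.seed(0)

def synth_applicant():
    group = random.choice(["A", "B"])
    # Invented correlation: group B lives in postcode 2 far more often.
    p_postcode2 = 0.8 if group == "B" else 0.1
    postcode = 2 if random.random() < p_postcode2 else 1
    return group, postcode

def approve(postcode):
    # The rule uses only postcode (historically tagged "high risk"),
    # so it looks neutral on its face.
    return postcode != 2

applicants = [synth_applicant() for _ in range(10_000)]
for g in ("A", "B"):
    subset = [pc for grp, pc in applicants if grp == g]
    rate = sum(approve(pc) for pc in subset) / len(subset)
    print(g, round(rate, 2))  # roughly A 0.9 vs B 0.2 -- disparate outcomes
```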

Additional Readings

  • Knight. W. (2017). Forget Killer Robots – Bias Is the Real AI Danger. MIT Technology Review. Retrieved from https://www.technologyreview.com/s/608986/forget-killer-robotsbias-is-the-real-ai-danger/ 
  • West, S.M., Whittaker, M. and Crawford, K. (2019). Discriminating Systems: Gender, Race and Power in AI. AI Now Institute. Retrieved from https://ainowinstitute.org/discriminatingsystems.html 
  • Brenner, J. G., & Spayd, L. (1993). A Pattern of Bias in Mortgage Loans. The Washington Post. Retrieved from https://www.washingtonpost.com/archive/politics/1993/06/06/a-pattern-of-bias-in-mortgage-loans/d04bcb29-d97b-44b5-b4e0-93db269f8f84/
  • Counts, L. (2018). Minority Homebuyers Face Widespread Statistical Lending Discrimination, Study Finds. Berkeley Haas. Retrieved from http://newsroom.haas.berkeley.edu/minority-homebuyers-face-widespread-statistical-lending-discrimination-study-finds/
  • Hao, K. (2018). Can you make an AI that isn’t ableist? MIT Technology Review. Retrieved from https://www.technologyreview.com/s/612489/can-you-make-an-ai-that-isnt-ableist/ 

4.4.3 Mortgage Application – Trust

  1. So let’s think about a question.
  2. Imagine that you went to a bank
  3. and you applied for a financial product,
  4. like a loan for a home.
  5. And you submitted all the paperwork,
  6. it was processed by the bank
  7. and a few days later you were rejected.
  8. And you went back to the loan officer and asked why.
  9. And they said hey,
  10. our AI decision-making software screened
  11. and scanned your application
  12. and said unfortunately, no.
  13. What would you do?
  14. Dave, what would you do?
  15. – It’s tricky, right?
  16. I mean, it’s already hard enough
  17. to communicate with banks as it is
  18. and now they’re moving it into this
  19. completely amoral space
  20. where, essentially this software
  21. is gonna be making a decision.
  22. And I mean, I’m not really sure
  23. you would have any recourse, would you?
  24. Like, they’re not gonna give you access to the algorithm,
  25. they’re not gonna show you probably exactly why
  26. and it just seems like it would be one step further away
  27. from, kind of, a balanced negotiation
  28. between you and the service provider, right?
  29. – Yeah and so I think, I think that’s a good point
  30. and the idea of not having recourse is really key.
  31. Because I think it raises really fundamental questions
  32. about what’s fair,
  33. right, because if that discrimination,
  34. well, if let’s say your rejection by the bank
  35. was based on some level of latent discrimination
  36. based on biased data
  37. or other forms of bias that may exist
  38. in that AI process then there’s some issues of fairness
  39. if you can’t go rectify that.
  40. – Yeah, well why is it so critical that banks,
  41. or really even we more broadly,
  42. think of these questions now?
  43. Especially something like bias.
  44. I mean isn’t that something that should come out
  45. later on?
  46. – I think what we’re finding already is that
  47. the longer we wait the more difficult it will be
  48. to implement, kind of, cleaner AI that has,
  49. you know, cleaner or better-filtered data.
  50. And partly because data compounds.
  51. You know, there are troves of data being
  52. produced every day
  53. and if we’re not aware of the influence
  54. of how that’s compounding and the negative inputs
  55. that are already there
  56. then that potentially becomes a problem.
  57. Particularly since a lot of that data’s already based
  58. on historical data
  59. that we know incorporates bias that existed
  60. because, you know, societal norms were different
  61. in the 1960s or 1970s versus what they are today.
  62. – Well can’t they just clean it up?
  63. Like can’t we just make it neutral somehow?
  64. – Well I think in certain cases
  65. you could perhaps be able to do that.
  66. But in a lot of the situations what that would do is,
  67. in the process of cleaning that up,
  68. perhaps other factors of the data,
  69. that you may need to rely on, also get influenced.
  70. So that data’s also not clear anymore.
  71. And so this then becomes a little bit of a catch-22.
  72. Fixing one problem creates another problem.
  73. – Yeah, okay, so where do you fall on this line?
  74. So let’s say we know that humans are very imperfect
  75. and we know that most of the bias
  76. that we have in the data is there
  77. because humans are biased
  78. and they will discriminate for race
  79. or gender or nationality
  80. and for a whole host of other reasons.
  81. And so, on the one hand,
  82. we clearly do not have a perfect track record.
  83. But on the flipside we are now entering into,
  84. potentially entering into this area
  85. of complete amorality
  86. that’s going to be built on the back
  87. of existing historical data and could introduce
  88. a whole new set of bias, or even worse,
  89. entrench existing bias
  90. into these decision-making processes.
  91. So what would you trust more?
  92. Do you trust the kind of human bias
  93. that’s inherent in the existing systems
  94. or do you trust the potential bias in these data sets
  95. and AI that’s gonna be coming,
  96. you know, in the next few decades?
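
One concrete audit a human reviewer could run, as part of the recourse discussed above, is to compare approval rates across groups. A widely cited heuristic, the “four-fifths rule” borrowed from US employment-selection guidance, flags any group whose rate falls below 80% of the most-favoured group’s. The sketch below is illustrative only: the rates are hypothetical, and real audits use richer metrics and statistical tests.

```python
# An illustrative audit sketch using the "four-fifths rule" heuristic:
# flag any group whose approval rate is below 80% of the most-favoured
# group's rate. The input rates are hypothetical.

def disparate_impact(rates: dict) -> dict:
    """Ratio of each group's approval rate to the highest group's rate."""
    best = max(rates.values())
    return {group: rate / best for group, rate in rates.items()}

approval_rates = {"A": 0.72, "B": 0.45}  # hypothetical audit inputs
for group, ratio in disparate_impact(approval_rates).items():
    status = "FLAG" if ratio < 0.8 else "ok"
    print(group, round(ratio, 2), status)  # B fails: 0.62 < 0.8
```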

Additional Readings

  • Kleinman, Z. (2018). IBM Launches Tool Aimed at Detecting AI Bias. BBC News. Retrieved from https://www.bbc.com/news/technology-45561955 
  • Golden, J. (2019). AI Has a Bias Problem. This Is How We Can Solve It. WEF. Retrieved from https://www.weforum.org/agenda/2019/01/to-eliminate-human-bias-from-ai-we-need-to-rethink-our-approach/ 
  • Greene, T. (2018). Human Bias is a Huge Problem for AI. Here’s How We’re Going to Fix It. The Next Web. Retrieved from https://thenextweb.com/artificial-intelligence/2018/04/10/human-bias-huge-problem-ai-heres-going-fix/ 
  • Luckerson, V. (2019). The Ethical Dilemma Facing Silicon Valley’s Next Generation. The Ringer. Retrieved from https://www.theringer.com/tech/2019/2/6/18212421/stanford-students-tech-backlash-silicon-valley-next-generation 
  • Rao, A., & Cameron, E. (2018). The Future of Artificial Intelligence Depends on Trust. Strategy+Business. Retrieved from https://www.strategy-business.com/article/The-Future-of-Artificial-Intelligence-Depends-on-Trust 

4.4.4 Mortgage Application – Accountability

  1. Well, I don’t know, Dave, that’s a difficult question.
  2. I think maybe if we go back to just a basic framework
  3. that we talked about earlier,
  4. about models that potentially are dangerous.
  5. And we talked about is the model opaque
  6. and not transparent?
  7. Does it have the possibility to scale
  8. and potentially be used by large amounts of people?
  9. And is the harm or potential damage
  10. that the model causes
  11. pretty substantive or substantial?
  12. And I think in the example of a home loan,
  13. that definitely applies to all three.
  14. Banks definitely will not open up their AI model,
  15. or decision-making model, to tell you how–
  16. – Not willingly.
  17. – Not willingly,
  18. how they made a decision.
  19. So, that would be quite rare.
  20. Secondly, the scalability of this is quite large.
  21. So, you can imagine some of the largest banks
  22. in the world
  23. with thousands or hundreds of thousands of clients,
  24. the impact or the scalability would be quite large.
  25. And then lastly, for each of those individual customers,
  26. the impact on their life
  27. could be huge.
  28. – Huge, yeah.
  29. – The difference between having a home
  30. and not having a home,
  31. what could be more fundamental
  32. to a person’s well-being,
  33. or idea of psychological kind of stability,
  34. than the opportunity, when they’re ready,
  35. to purchase a home?
  36. So, these are really fundamental things
  37. that I think we have to think about.
  38. Now, going back to the idea of recourse and the balance
  39. of do we trust the human, even though we have bias?
  40. Or do we trust the AI, even though
  41. that also has some level of bias?
  42. I think it’s a bit of a mix and I think we need both.
  43. I don’t think we can completely do away
  44. with the human element.
  45. – Right.
  46. – And rely completely on the AI,
  47. but we can’t go the other way,
  48. overboard the other way either.
  49. And I think in our seeking of efficiency of how we use AI,
  50. one of the big draws
  51. is that it will hopefully make us more efficient, right?
  52. Some of the repetitive tasks and the things
  53. that take up a lot of our time,
  54. maybe we won’t have to do those anymore.
  55. But in our pursuit of this greater efficiency,
  56. I do think we still need to sacrifice a little efficiency
  57. to keep a human element there.
  58. So, when we go back to the example,
  59. do you have any recourse?
  60. Well, what would be great is, if banks continue to have
  61. somebody there that a rejected applicant can go to
  62. and say, hey, I got rejected.
  63. I just wanna understand why.
  64. And then you could have somebody there to explain
  65. the process and potentially follow up to see
  66. if things were interpreted incorrectly.
  67. I think that would be the best of both worlds.
  68. Now, of course not a lot of organisations may be willing
  69. to do that,
  70. but I think there will be organisations
  71. that will be willing to do that.
  72. Particularly as we have these debates
  73. about how to balance this responsibility and this trust
  74. between the different stakeholders involved.
  75. – Yeah, and you say that banks or financial institutions
  76. wouldn’t willingly allow people to see their algorithm
  77. or other kind of inside data.
  78. I completely agree that they wouldn’t willingly do that.
  79. But I wonder if transparency really is the key
  80. to ensuring these things.
  81. And they do say sunlight is the best cure
  82. from the ethics perspective.
  83. I wonder if that is the eventual future of this.
  84. If we’re going to rely so heavily on these products,
  85. if we’re going to rely so heavily
  86. from a financial, entire industry perspective,
  87. I wonder if the eventual step would be, just like a patent,
  88. where you’re granted a patent but in return,
  89. you have to provide very public data on the creation
  90. and various components of that particular device.
  91. I wonder if that means that eventually,
  92. if you hit these three things, if it’s not transparent,
  93. if it’s really scalable, and if the potential for harm
  94. is very significant,
  95. I wonder if there would be
  96. either a public or even kind of private
  97. governmental disclosure required to show
  98. that there isn’t bias within the system?
  99. – Yeah, and I think that’s a really interesting point.
  100. And from a broader level,
  101. some people are talking about this
  102. in the context of large technology companies
  103. that have grown so much
  104. and become such a part of our lives.
  105. And maybe those companies shouldn’t
  106. just be considered a normal company.
  107. – Right.
  108. – Because they’re so influential,
  109. but maybe we should regulate them
  110. like a financial company or even a public utility.
  111. – Even a utility.
  112. – That’s right.
  113. – And so, I think that’s part of the broader debate
  114. we’re having as a society to understand
  115. how we want to manage the influence,
  116. the increasing influence of these companies in our lives.
  117. – Okay, so that’s something to think about.
  118. From the standpoint of artificial intelligence (AI),
  119. as these things become more and more ubiquitous
  120. and utilised around us all the time,
  121. is to think about how are they making life
  122. more transparent, more efficient and more unbiased?
  123. Or are they actually entrenching existing biases
  124. and therefore kind of further distancing
  125. certain segments of society from the financial markets
  126. and from financial inclusion?
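
The three warning signs applied to the mortgage example above (opacity, scale, and potential for harm) can be framed as a simple checklist. The sketch below is an illustrative framing of that test, not code from Weapons of Math Destruction, and the profile values are judgment calls, just as in the dialogue.

```python
# An illustrative checklist (my framing, not code from the book) for the
# three warning signs discussed above: opacity, scale, and harm.
from dataclasses import dataclass

@dataclass
class ModelProfile:
    name: str
    opaque: bool     # outsiders cannot inspect how decisions are made
    scalable: bool   # can be applied across very large populations
    high_harm: bool  # a wrong decision can seriously damage someone's life

def warning(m: ModelProfile) -> bool:
    """All three characteristics present -> treat the model with suspicion."""
    return m.opaque and m.scalable and m.high_harm

mortgage_ai = ModelProfile("mortgage screening",
                           opaque=True, scalable=True, high_harm=True)
print(warning(mortgage_ai))  # True -- ticks all three boxes
```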

Additional Readings

  • Hao, K. (2019). When Algorithms Mess Up, The Nearest Human Gets The Blame. MIT Technology Review. Retrieved from https://www.technologyreview.com/s/613578/ai-algorithms-liability-human-blame/ 
  • Sears, M. (2018). AI Bias And The ‘People Factor’ In AI Development. Forbes. Retrieved from https://www.forbes.com/sites/marksears1/2018/11/13/ai-bias-and-the-people-factor-in-ai-development/#5e9e77319134 
  • Bakst, J. (2019). BankThink: Don’t Let AI Become a Black Box. American Banker. Retrieved from https://www.americanbanker.com/opinion/dont-let-ai-become-a-black-box 
  • Whittaker, M. et al. (2018). AI NOW Report 2018. AI Now. Retrieved from https://ainowinstitute.org/AI_Now_2018_Report.pdf 

4.5.1 Social Credit

  1. Let’s look at another example
  2. and talk a little about credit,
  3. a pillar of the modern financial system.
  4. For many people in the world,
  5. credit is part of everyday life, ranging from credit cards
  6. to borrowing money from a bank to buy a home.
  7. For many, the ability to access and use credit
  8. is largely defined by a credit score,
  9. which ultimately gauges how likely a particular person
  10. is to repay the money they have borrowed.
  11. Conceptually, the better the score, the lower the risk.
  12. So in the United States,
  13. we have something called a FICO score,
  14. named after its creators,
  15. Bill Fair and Earl Isaac,
  16. who created the Fair Isaac Corporation,
  17. which initially produced these scores.
  18. To calculate a FICO score,
  19. different financial data such as,
  20. bank account information, existing debt levels,
  21. payment history, and other related information
  22. are used together to calculate a credit score.
  23. Many other countries now also have their own versions
  24. of these scores.
  25. In theory, the use of these scores is important
  26. because individuals can more freely access capital
  27. and other financial products since banks
  28. and financial institutions are more willing to lend money
  29. and likely at lower interest rates because they have this
  30. credit information.
  31. So a mature credit system makes accessing capital
  32. easier.
  33. And for many in Hong Kong, or the UK,
  34. or other countries with developed financial systems,
  35. the notion and use of credit is quite mature,
  36. a given, really, almost an afterthought.
  37. But what if there was no credit score
  38. for a financial institution or bank to assess
  39. your risk when you needed to borrow money?
  40. How might that impact you?
  41. Well, that bank may require you to pay a really high
  42. interest rate or pledge a lot of collateral,
  43. even for a small loan,
  44. or they might even require both.
  45. It was due to such challenges that microfinance
  46. lending organizations, like the Grameen Bank,
  47. founded by Muhammad Yunus, were formed.
  48. Now the issue of credit really becomes apparent
  49. when you consider there are approximately
  50. 2 billion people in the world that are unbanked.
  51. So this basically means,
  52. roughly 25% of the world’s population
  53. doesn’t have a bank account.
  54. Without access to the financial system,
  55. which for most people in the world is through a bank,
  56. then of course it’s extremely difficult
  57. to develop a credit history and a credit score.
  58. The lack of this information makes it difficult
  59. for the unbanked to access credit,
  60. which means to borrow money,
  61. leaving many mired in the same financial situation.
  62. So the exciting thing is that FinTech paired
  63. with mobile technology can help solve this conundrum.
  64. With the rise of mobile phones,
  65. and particularly smartphones,
  66. and the shift to digital banking,
  67. there’s a lot of opportunity.
  68. So, many of today’s unbanked
  69. may never, or at least rarely,
  70. access a traditional brick and mortar bank,
  71. but increasingly many will patronize digital banks,
  72. even online-only banks,
  73. and other digital financial services
  74. via their mobile device.
  75. This is, and will be, incredibly empowering
  76. for many of the world’s neediest populations,
  77. and one of the great potential democratizing aspects
  78. of FinTech – giving people more opportunities.
  79. Now, for people that may be using mobile devices
  80. but are still not fully integrated into the financial system,
  81. or with only minimal financial data,
  82. there is still the problem of trying
  83. to determine their credit.
  84. So one alternative to traditional forms of credit analysis
  85. is the rise of social credit.
  86. In its simplest form, social credit basically means that
  87. any kind of data, not just financial, can possibly be used
  88. to determine some level of credit.
  89. For example, your Facebook network,
  90. and your relationships there,
  91. the type of people you most frequently message
  92. on your phone,
  93. or the amount of time you spend watching Taylor Swift
  94. videos on your phone,
  95. and a whole host of other behavioral
  96. and relationship knowledge,
  97. that is not necessarily financial,
  98. can be utilized by AI-backed algorithms
  99. to compile a profile on you
  100. – a social credit profile –
  101. that may have an impact on your financial
  102. and social life.
  103. Sounds fascinating, but is this okay?
  104. What are the benefits?
  105. What are the risks?
  106. Aspects of social credit are being rolled out
  107. in various ways already.
  108. At a national level, China is implementing its own
  109. indigenous social credit system,
  110. a reputational score system that applies to
  111. individuals and companies,
  112. with the intention for it to eventually score all of
  113. its citizens when the system is fully developed.
  114. The early stages of this social credit system
  115. have already garnered attention as almost
  116. 10 million people have been banned
  117. from domestic air travel in China alone,
  118. and this is all based on their social credit score.
  119. Other potential impacts include
  120. limiting access to certain educational opportunities,
  121. or employment.
  122. And social credit scores could even impact
  123. one’s Internet speed.
  124. It’s not just nation-states,
  125. but really private sector actors are leading the charge.
  126. Ant Financial, one of the world’s largest
  127. FinTech companies,
  128. and related to Chinese technology giant Alibaba,
  129. has also started developing its own form of
  130. alternative credit,
  131. dubbed “Sesame Credit.”
  132. In addition to traditional financial information that
  133. something like a FICO score might include,
  134. Sesame Credit also incorporates other information like
  135. the online behavior of a person,
  136. especially in the context of their activity
  137. within the Alibaba ecosystem.
  138. A high Sesame Credit score
  139. improves the user’s “trust” level within the system
  140. and facilitates access to Ant Financial products.
  141. But you know, China is not the only place
  142. where social credit analysis is growing.
  143. Even in Silicon Valley,
  144. you can observe aspects of social credit.
  145. Dealing with myriad issues related to fake news claims,
  146. Facebook has developed its own rating system
  147. to gauge the reliability and trust of its users.
  148. One criticism of this, however, is
  149. that even if such a tool might be necessary,
  150. it’s not transparent.
  151. And this is something we’ve discussed before about
  152. this idea of transparency.
  153. The use of social credit will continue to expand,
  154. either as a direct proxy or at the very least a supplement
  155. to traditional financial credit.
  156. Maybe nowhere is this more apparent than
  157. in the peer-to-peer (a.k.a. P2P) lending market,
  158. which is another important part of the FinTech landscape.
  159. Many P2P platforms incorporate some aspect of
  160. social credit in their models.
  161. For example, one of the larger P2P platforms,
  162. Lending Club,
  163. which is listed on the New York Stock Exchange,
  164. was originally an application on Facebook that spun off.
  165. Prior to its IPO in 2014,
  166. Lending Club frequently mentioned that social
  167. relationships were an important part of its model
  168. and that social affinity and other non-financial factors
  169. helped lower the risk of non-payment.
  170. As P2P platforms grow, more data becomes available,
  171. and AI capability enhances, it will be interesting
  172. and important to consider how social credit will be used
  173. in the future to influence our lives.
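
Mechanically, both traditional and social credit scoring can be thought of as a weighted combination of features; what changes is which features are allowed in. The toy sketch below is purely illustrative: FICO’s actual formula is proprietary, and every feature name, weight, and value here is hypothetical. It shows how a “thin file” applicant can look very different once non-financial signals, such as the mobile phone bill payments discussed in the next section, are folded in.

```python
# A deliberately toy weighted-sum scorer. FICO's real formula is proprietary;
# every feature name, weight, and value below is hypothetical. Features are
# normalized so that higher always means "better" for creditworthiness.

def credit_score(features: dict, weights: dict) -> float:
    """Weighted sum of normalized feature values in [0, 1]."""
    return sum(weights[k] * features[k] for k in weights)

traditional_weights = {"payment_history": 0.6, "low_debt": 0.4}
social_weights = {"payment_history": 0.4, "low_debt": 0.3,
                  "phone_bill_on_time": 0.2, "network_quality": 0.1}

applicant = {"payment_history": 0.0,    # "thin file": no formal history
             "low_debt": 0.5,
             "phone_bill_on_time": 1.0, # but pays the phone bill monthly
             "network_quality": 0.7}

print(round(credit_score(applicant, traditional_weights), 2))  # 0.2, risky
print(round(credit_score(applicant, social_weights), 2))       # 0.42, lendable
```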

Additional Readings

  • China’s social credit system. (2019). South China Morning Post. Retrieved from https://www.scmp.com/topics/chinas-social-credit-system 

4.5.2 Social Credit – Subjectivity of Morality

  1. So revisiting our earlier example
  2. about purchasing a home.
  3. Imagine, you go to a bank to get a home loan.
  4. And in addition to the financial information you give
  5. to the bank to evaluate your credit worthiness,
  6. they ask you for social information
  7. about your behaviours on your phone
  8. and your computer,
  9. what websites you frequent,
  10. what kind of games you play, for how long,
  11. what kind of music videos you watch on YouTube?
  12. How would that make you feel?
  13. And how would that impact how you live your life
  14. on a daily basis?
  15. But Dave, how would you feel
  16. if this was the situation you were in?
  17. – This is, I think,
  18. to be perfectly blunt,
  19. I think it’s kinda scary and it’s something that
  20. I don’t really get caught up in a lot
  21. of the more dystopian future of AI and stuff.
  22. I feel like we’re probably a long way off from that.
  23. But this is one area
  24. from a behavioural modification standpoint,
  25. I feel like there are pretty concrete historical examples,
  26. not even that long ago, where broad-scale social change
  27. through behaviour modification, especially when looking
  28. at peer groups, family members, educational history,
  29. religious or other beliefs,
  30. led to fairly broad, dire consequences.
  31. So, let’s start with the good stuff.
  32. Let’s not be too negative.
  33. Some of the most successful examples of this, in Africa
  34. and in developing parts of Asia,
  35. involve a really, really simple aspect
  36. of social credit, which would be whether
  37. or not you pay your mobile phone bill.
  38. If you are in Kenya and you’re utilising one of the
  39. mobile banking payment platforms
  40. and you don’t have a bank account,
  41. whether or not you pay
  42. your mobile phone bill each month is probably
  43. the best example of your creditworthiness.
  44. – So how likely you’ll repay.
  45. – Exactly.
  46. – Because you pay your phone bill for the last year.
  47. – When that came out, I thought it was brilliant
  48. and I think you know, millions and millions of people
  49. have benefited from that aspect of social credit scoring.
  50. On the flipside,
  51. if you look at some of the other examples of this
  52. where they’ll look at browser history.
  53. They’ll look at how many hours you spend each week
  54. playing video games.
  55. They’ll look at
  56. what I would consider more moral decision making,
  57. that’s the type of stuff that concerns me I think.
  58. – Okay and so, is the concern about,
  59. when we think about morality,
  60. frequently people think about who gets to decide
  61. what’s moral or not?
  62. Is that the concern you’re referring to?
  63. – That’s exactly right.
  64. I mean think about it.
  65. If you’re home right, who is to say
  66. whether your behaviour is specifically good
  67. or specifically bad especially when we’re talking
  68. about accessing credit.
  69. Some of the examples they gave are that
  70. playing video games is bad
  71. and so therefore people that play a lot of video games
  72. should be less worthy of credit.
  73. Even if I agreed with that on a personal level
  74. which I don’t necessarily, it’s very dangerous to think
  75. that a small group of people, probably men
  76. who we don’t really know who they are
  77. or what they’re discussing,
  78. they’re going to be the ones to determine what is moral
  79. and therefore what is acceptable in society
  80. and as we’ve already discussed,
  81. this can have extremely broad implications
  82. in terms of whether or not you can buy a house
  83. or whether you can get a visa to travel
  84. outside the country
  85. or in some cases even determining what types
  86. of majors you can have,
  87. what types of careers you can enter into.
  88. – And I think, returning to something you didn’t mention
  89. about the potential risk of the impact:
  90. we’ll start modifying how people act and behave.
  91. I think that’s really important.
  92. You know, philosophers from a long time ago
  93. to more modern philosophers have talked about
  94. this idea of what observation
  95. does to people’s behaviour.
  96. Even though nobody is coming
  97. and compelling you physically to do something,
  98. the fact that you feel like you’re being watched
  99. even if you may not actually be being watched,
  100. but the fact that you think you’re being watched,
  101. actually starts shaping your behaviour.
  102. And that is a very interesting as well as
  103. scary proposition.
  104. – Potentially. And again,
  105. not to be too negative here because the reality is that
  106. you and I, we generally conform to the best aspects
  107. of human behaviour.
  108. That is why as a species,
  109. generally speaking, we get better and better.
  110. There is less violent crime right now.
  111. We tend to mirror the best elements of our humanity.
  112. But I’ll give you quick example,
  113. so my father one time when we were,
  114. well not once, he used to say this a lot
  115. as I was young.
  116. He would take me to go perform service
  117. within the community
  118. and I, like many teenagers, would go quite begrudgingly,
  119. and I’d be, you know, complaining the whole way.
  120. And he would say to me “If you don’t want to do this,
  121. then this is not going to be something that counts
  122. as a benefit to you.”
  123. Meaning that I had to actually want to do it in order for it
  124. to be service that would benefit me kind of spiritually
  125. or you know, psychologically.
  126. And so this runs into the question of
  127. when you are trying
  128. to modify behaviour from ethics context,
  129. can you compel people into a certain type of behaviour
  130. and then make them good?
  131. Can you compel people into goodness
  132. or do you have to educate them
  133. and inspire them into goodness?
  134. – I see that’s interesting.
  135. So the idea goes to kind of internal inherent motivation
  136. that the person has in the action
  137. even though both people may be doing good things,
  138. we actually think maybe the motivation
  139. for doing the good thing sets them apart.
  140. – Yeah and if history is any example,
  141. when societies have tried to compel a good type
  142. of moral behaviour,
  143. that often has led to really some of the most dire
  144. consequences socially speaking
  145. because people will not feel that inherent
  146. sense of shame or morality in their decision-making
  147. and instead they often look to avoid those things
  148. and often become very dissociated from society.
  149. And it can create some very significant
  150. perverse incentives.
  151. – Okay, is that similar to this idea of a checkbox morality,
  152. in the sense that if I’m doing these things that are supposed
  153. to be good in society,
  154. I’m a good person, when in fact,
  155. just because you’re checking the boxes may not mean
  156. that you’re actually a good person?
  157. – Yeah, well there are two aspects of it.
  158. One is checking the box
  159. and therefore feeling like I’m good:
  160. as long as I tick the box,
  161. then I’m a good person,
  162. and anything outside those boxes is okay,
  163. it’s justified, because I’ve ticked the boxes.
  164. But the second one which is slightly more pernicious
  165. is the idea that we’re ticking the box to tick the box
  166. but we know that that is not necessarily
  167. what our true intent or true desire is
  168. and that’s when I think some of the more malicious stuff
  169. can come through,
  170. and again there are examples of this historically,
  171. where you could have genocide
  172. or significant inequality that’s perpetuated
  173. simply based on false definitions of morality,
  174. you know, just to give an example
  175. for those who are confused at home,
  176. what if based on my sense of morality,
  177. I believe that a particular minority race
  178. was not worthy of voting,
  179. not worthy of financial credit,
  180. was not allowed to own property, right?
  181. I could say that god has told me this is the right thing
  182. to do and that is my definition of morality
  183. when in actual fact, you know,
  184. we as a society would hopefully say
  185. that’s actually a pretty terrible thing.
  186. – Yeah and to your point, that’s happened myriad times
  187. – Many, many times, very recently.
  188. – Across the history of humanity, right?
  189. A lot of that discrimination potentially was based
  190. on religion.
  191. Some of that was based on how we look,
  192. where we were born.
  193. – Political perspective.
  194. – Exactly.

Additional Readings

  • Ma, A. (2018). China Social Credit System, Punishments and Rewards Explained. Business Insider. Retrieved from https://www.businessinsider.com/china-social-credit-system-punishments-and-rewards-explained-2018-4 
  • Milton Friedman – Freedom Not to Act. LibertyPen. Retrieved from https://www.youtube.com/watch?v=q84y08nu74I (video)

4.5.3 Social Credit – Accountability

  1. The other kind of concept that is interesting to me
  2. relating to social credit is, on one hand, you’re right.
  3. I think social credit has been incredibly enhancing
  4. for populations that can’t access
  5. traditional forms of financial credit,
  6. which limits them from accessing money,
  7. like you had talked about.
  8. – Yeah, yeah.
  9. – The thing that is a bit, not disturbing,
  10. but gives me pause about social credit
  11. is that we are using social credit as a proxy
  12. for financial credit, or financial data.
  13. – Yeah, yeah.
  14. – And whenever we use something that’s a proxy,
  15. generally, it’s rare that it’s one for one.
  16. So if we see a one here, and we wanna look at the proxy,
  17. it’s rare that the proxy will match up exactly.
  18. There’s usually some slippage, right?
  19. Or some parts that don’t overlap.
  20. – Yeah.
  21. – And what I’m afraid of is,
  22. if we inculcate the idea that using proxies
  23. is somehow the same thing as using the real thing,
  24. and that kind of leads into the ethos
  25. of how we think about AI
  26. and FinTech and these technologies,
  27. then we can really find ourselves in a situation
  28. where we assume that’s okay,
  29. but in reality, the proxy and the real thing
  30. actually don’t overlap that much.
  31. – Yeah.
  32. And then we become,
  33. we move farther and farther away
  34. from what the real objective actually was.
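
To make this "slippage" point concrete, here is a minimal sketch (our illustration with invented numbers, not part of the course): a made-up "true" creditworthiness score, a social-credit proxy that only partially overlaps with it, and a count of how many approval decisions flip when the proxy is used instead of the real thing.

    # Minimal sketch of proxy "slippage" (invented numbers, for illustration only).
    import random

    random.seed(42)

    people = []
    for _ in range(10_000):
        true_score = random.gauss(600, 100)                  # hypothetical "real" creditworthiness
        noise = random.gauss(0, 80)                          # the part the proxy gets wrong
        proxy_score = 0.7 * true_score + 0.3 * 600 + noise   # imperfect social-credit proxy
        people.append((true_score, proxy_score))

    cutoff = 600  # hypothetical approval threshold
    flipped = sum(1 for t, p in people if (t >= cutoff) != (p >= cutoff))
    print(f"Approval decisions changed by using the proxy: {flipped / len(people):.1%}")

Even with a proxy that tracks the real score fairly well, a noticeable share of decisions come out differently, which is exactly the non-overlap being described here.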
  35. – Yeah, and I think, of course, that can be right.
  36. I think, on the flipside, there are a lot of examples
  37. where traditional credit scores
  38. have been shown to be problematic.
  39. False information, you know, identity theft,
  40. and anyone who’s had their identity stolen before
  41. knows how incredibly difficult
  42. it can be to clean up your credit.
  43. And so I think at the end of the day,
  44. what we’re saying is,
  45. social credit and other AI and machine learning-based
  46. credit rating systems
  47. can be incredibly, incredibly powerful,
  48. and can bring people to the financial markets
  49. that have never had access to it,
  50. but, like everything else
  51. that we’re talking about in this course,
  52. it requires those aspects of transparency,
  53. trust,
  54. proximity,
  55. to ensure we have the rules right up front,
  56. that we’re thinking about those things up front,
  57. so that we’re building a better system,
  58. and not entrenching these biases into existing systems.
  59. – And I think that is critical, what you just mentioned.
  60. Because what will happen,
  61. and what has already happened,
  62. even before the advent of AI
  63. and these other technologies is,
  64. if some sort of interesting process
  65. or non-AI technology came into existence,
  66. then it tended to get rolled out
  67. to other parts of the world.
  68. – Yeah.
  69. – And if a particular form of AI
  70. or some technology seems to be effective,
  71. then it’s very easy for that model,
  72. for that algorithm, to start propagating
  73. into sectors and industries and geographies
  74. that it was never intended for.
  75. – Absolutely.
  76. – But we just assume that it’s okay,
  77. because it worked well in California,
  78. or England, or Australia, or something like that.
  79. – Yeah, or a particular industry.
  80. Mortgages, use it for car loans, and stuff, yeah.
  81. – That’s right,
  82. and we assume that it will be a very easy transition
  83. across industries or sectors,
  84. where actually, that’s not necessarily the case,
  85. and in fact, it could cause more danger.
  86. So it goes back to this idea of scalability.
  87. Now we’re really scaling across the world,
  88. across geographies, across industries,
  89. and then across people, ultimately.
  90. – Yeah.

Additional Readings

  • Backer, L. C. (2018). Next Generation Law: Data Driven Governance and Accountability Based Regulatory Systems in the West, and Social Credit Regimes in China. Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3209997 

4.5.4 Social Credit – Privacy

  1. We have another question for you.
  2. What are the implications of such systems?
  3. So Dave, what do you think?
  4. What are the implications of these social credit systems
  5. that we’re talking about that are backed
  6. and powered by AI?
  7. – Well okay so let’s go back to
  8. what we were talking about with, say, Taylor Swift
  9. and Jacky Cheung and these concerts, right?
  10. – Oh always, that’s always an interesting topic.
  11. – The idea that for example
  12. what we talked about back then was
  13. from a security standpoint, the guy gets caught right?
  14. So that’s a good thing
  15. and one of the questions you asked me was,
  16. well, why should it be a big deal
  17. if the bad guy gets caught?
  18. You know we shouldn’t want him there in the first place.
  19. My response then, and really
  20. how I relate to this now, is:
  21. yes absolutely, we want to have as secure
  22. an environment as possible,
  23. but I think we at least have to ask at what cost?
  24. Right, the idea being that if we are granting
  25. this incredibly broad level of,
  26. well, not granting, we don’t have any control over it,
  27. but if there’s this broad level of surveillance.
  28. If we have this mobile technology
  29. on us all the time,
  30. and if we are now going to be introducing
  31. social aspects of behaviour into all of this,
  32. I mean really rating us, literally rating us.
  33. I think these are some things that
  34. we at least need to think about
  35. collectively as society to understand
  36. what are we giving up for this right?
  37. – Okay now, that’s an important question
  38. that we all need to consider and think about.
  39. What is the sacrifice we’re willing to make
  40. to have this increased security?
  41. In the context of the concert that we talked about,
  42. there was this massive video surveillance.
  43. I think one compelling aspect of this is that
  44. AI is powering the video surveillance,
  45. making it much more effective, and now
  46. we have aspects of social credit about our behaviours.
  47. So I think in the future you could
  48. easily see a situation where those are paired.
  49. – Yeah.
  50. – As a broader way to surveil or control populations.
  51. – You might have higher or lower credit
  52. because you like Taylor Swift.
  53. – That’s right.
  54. – Right.
  55. – And that composite is created
  56. through many more data points now.
  57. Through surveillance that’s happening,
  58. where you frequent,
  59. how frequently you go to 7-Eleven.
  60. – Yeah.
  61. – On a particular day and at a particular time, right.
  62. Or how frequently you stay out late at night.
  63. All these things are now going
  64. to be able to be captured more through observation,
  65. as well as through the actual data
  66. that you are creating through your own
  67. usage of various devices.
  68. – Yeah.

Module 4 Conclusion

  1. So, what does the future hold?
  2. Of course, I mean, no one really knows exactly.
  3. But what is certain is that AI is gonna be a big part of it.
  4. Famed inventor and noted futurist Ray Kurzweil
  5. predicted that a technological singularity will be reached
  6. in 2045.
  7. Such a singularity basically represents a future
  8. where AI powered super-human intelligence
  9. is so powerful that it will create even more
  10. innovative technologies,
  11. which could possibly lead to very new realities
  12. that change our assumptions about intelligence
  13. and perhaps the nature of our very existence.
  14. This could be the path to the utopian existence
  15. only portrayed in science fiction movies.
  16. Now that said, there are those,
  17. like Nick Bostrom, a philosopher at Oxford,
  18. and Director of the Future of Humanity Institute,
  19. and also author of Superintelligence,
  20. or Elon Musk, who most people know,
  21. that have expressed serious reservations
  22. about a post-singularity future.
  23. Even Stephen Hawking once mentioned,
  24. “The development of full artificial intelligence
  25. could spell the end of the human race.”
  26. Wow, that sounds scary.
  27. Well, this doomsday scenario is largely motivated
  28. by the possibility and fear that AI
  29. may become so advanced that
  30. we may not be able to control it,
  31. and perhaps such AI will eventually want
  32. to manage and control us.
  33. Now additionally, there are other AI-related concerns,
  34. many of which have already been touched on
  35. in this module,
  36. ranging from fairness to privacy to displaced labor.
  37. As an AI-dominated future becomes more imminent,
  38. communities of concerned technologists, lawmakers,
  39. and other interested parties are coming together
  40. to grapple with and define the ethical issues
  41. surrounding AI.
  42. This is happening in China with the formation of a
  43. national level AI ethics committee
  44. and other examples include the European Union’s
  45. High-Level Expert Group on AI,
  46. which has released its own guidelines
  47. on the ethics of AI,
  48. and even in the US,
  49. there’s an organization called The Partnership on AI,
  50. which is a collection of leading global companies and
  51. institutions working together,
  52. for a stated purpose of
  53. “shaping best practices, research, and public dialogue
  54. about AI’s benefits for people and society.”
  55. So what’s next?
  56. In his best-selling book, Zero to One,
  57. well-known Silicon Valley entrepreneur and investor,
  58. Peter Thiel wrote:
  59. “…humans are distinguished from other species
  60. by our ability to work miracles.
  61. We call these miracles technology.”
  62. AI makes the possibility of such miracles
  63. much more real.
  64. And ultimately, the future is not fixed,
  65. nor its outcome certain, and because of that,
  66. each of us, you and us,
  67. has the opportunity to shape the future.
  68. And hopefully this module has compelled you
  69. to further consider the parameters
  70. and maybe even limitations,
  71. that we need to place on these technologies
  72. that have the potential to do so much
  73. but potentially at great cost as well.

Additional Readings

    • Bostrom, N. (2014). Superintelligence: Paths, dangers, and strategies. Oxford: Oxford University Press. 
    • Hauer, J. (2016). The Funny Things Happening On the Way to Singularity. TechCrunch. Retrieved from https://techcrunch.com/2016/04/09/the-funny-things-happening-on-the-way-to-singularity/ 
    • Metz, C. (2018). Mark Zuckerberg, Elon Musk and the Feud Over Killer Robots. The New York Times. Retrieved from https://www.nytimes.com/2018/06/09/technology/elon-musk-mark-zuckerberg-artificial-intelligence.html 
    • Metz, C., & Isaac, M. (2019). Facebook’s A.I. Whiz Now Faces the Task of Cleaning It Up. Sometimes That Brings Him to Tears. New York Times. Retrieved from https://www.nytimes.com/2019/05/17/technology/facebook-ai-schroepfer.html
    • Hawking, S., Russell, S., Tegmark, M., & Wilczek, F. (2014). Stephen Hawking: 'Transcendence Looks at the Implications of Artificial Intelligence - But Are We Taking AI Seriously Enough?' The Independent. Retrieved from https://www.independent.co.uk/news/science/stephen-hawking-transcendence-looks-at-the-implications-of-artificial-intelligence-but-are-we-taking-9313474.html
    • Ethics Guidelines for Trustworthy AI: High-Level Expert Group on Artificial Intelligence (2019). European Commission. Retrieved from https://ec.europa.eu/digital-single-market/en/news/ethics-guidelines-trustworthy-ai 
    • Knight, W. (2019). Why Does Beijing Suddenly Care About AI Ethics? MIT Technology Review. Retrieved from https://www.technologyreview.com/s/613610/why-does-china-suddenly-care-about-ai-ethics-and-privacy/
    • Stone, P., et al. (2016). Artificial Intelligence and Life in 2030. One Hundred Year Study on Artificial Intelligence: Report of the 2015-2016 Study Panel. Stanford University. Retrieved from http://ai100.stanford.edu/2016-report
    • Araya, D. (2019). Who Will Lead in the Age of Artificial Intelligence? Brookings Institution. Retrieved from https://www.brookings.edu/blog/techtank/2019/02/26/who-will-lead-in-the-age-of-artificial-intelligence/

Module 4 Roundup

  1. – Hi, welcome back.
  2. Module four’s roundup.
  3. We’re excited this week for two reasons.
  4. One, as we mentioned last week,
  5. artificial intelligence is one of our favourite topics
  6. and there are a lot of implications there
  7. about the future, about society,
  8. and a lot of things that are really important to us.
  9. So hopefully, it’s been meaningful for you
  10. as you’ve gone through the module.
  11. But perhaps most importantly, we’re here in Hong Kong.
  12. David Bishop and I are teaching a class this week
  13. related to FinTech.
  14. And some of our great students have kindly joined us
  15. in this week’s roundup
  16. to discuss some of the questions that you’ve shared with us.
  17. So maybe David, you can kick us off
  18. with our first question.
  19. – So as always, we’ve loved the comments you guys have had,
  20. really appreciate you sending them out there,
  21. and some of the comments you’ve had
  22. we kinda wanna throw out to our class.
  23. They’re from all over the world, really diverse group.
  24. And so the first comment that we had was about surveillance
  25. and really relates to some of the AI things
  26. we’ve been talking about this week.
  27. We’re constantly surveilled,
  28. there’s cameras everywhere,
  29. there’s ATMs on the street.
  30. So what do you think about that?
  31. What are some of your thoughts?
  32. Is it a little bit scary, is it better
  33. because it makes the world safer?
  34. What are your perspectives on this
  35. in terms of the utilisation of facial recognition
  36. and A.I. in our everyday lives?
  37. – Yeah so I guess it depends where you’re coming from
  38. in the world.
  39. If you’re in the States, I think people
  40. would be really scared and kind of be against this.
  41. If you’re from China, some views are
  42. that if you have nothing to hide,
  43. you have nothing to be scared of.
  44. But I think there are two main things.
  45. The first thing, if you’re from the States in particular,
  46. is your rights, your freedom.
  47. Some people have been seen in the news
  48. trying to cover their faces
  49. when they see a camera
  50. and police officers actually force them
  51. to show their faces, – Right
  52. which I think is not right so to speak.
  53. – Yeah
  54. Another thing is these cameras in public
  55. can be tampered with.
  56. I think they could be used for blackmail
  57. involving police officers and law enforcement,
  58. or high-level figures in the world.
  59. But there’s also pros with having it as well.
  60. – Yeah
  61. – So you know, uh, public-safety…
  62. – So you personally, how do you feel?
  63. – Personally, I think it’s okay
  64. I have nothing to hide.
  65. But I do feel that, ’cause I am from both
  66. Hong Kong and the United States
  67. that if I don’t wanna show my face on the camera
  68. I shouldn’t be forced to.
  69. – Yeah
  70. – But I don’t have an issue with it.
  71. – Yeah so for the class out there
  72. what he’s referencing is during our class here
  73. we actually showed a video,
  74. from London of all places,
  75. which kinda surprised us.
  76. Just from three weeks ago,
  77. where they had set up a police area
  78. outside on the street, and they were requiring
  79. everyone walking by to go
  80. through facial-recognition software
  81. and a gentleman didn’t want that
  82. he thought it was an invasion of his rights,
  83. so he pulled up his jacket over his face,
  84. he pulled his hat down
  85. and the police actually used that as probable cause
  86. to detain and question him.
  87. And so it’s kind of a catch-22,
  88. a lose-lose scenario.
  89. Either, you let them scan your face against your will
  90. or you run the risk of them using that as probable cause
  91. in order to question you.
  92. So Cameron’s saying that generally speaking,
  93. not a big deal if they scan us and stuff
  94. because it keeps us safer, nothing to hide.
  95. But, if somebody wants to hide their face,
  96. you think that’s perhaps what they should be able to do.
  97. Is that fair? – [Cameron] Yeah
  98. Okay so Kate, what do you think?
  99. ‘Cause you’re from Shanghai, from China,
  100. cameras everywhere, what is your take on this?
  101. – I think, I still think that, uh, how to say,
  102. I’m going to focus on this:
  103. intuitively it’s very scary, right?
  104. Everything you do shows up on surveillance
  105. and everything.
  106. But actually, surveillance has happened
  107. for hundreds and hundreds of years, in different,
  108. different ways.
  109. – Yeah
  110. – So I still think the technology
  111. or say the tool, is not the centrepiece.
  112. The centrepiece is that we be mindful
  113. of these kinds of risks, like Cameron just said,
  114. and to do the right thing.
  115. – Okay so one twist on this question
  116. and I’m curious if anybody has any thoughts.
  117. What we haven’t talked about in this course yet
  118. is the introduction
  119. of these types of surveillance
  120. with deep fakes.
  121. So, some of the newer technologies,
  122. they’re able to take
  123. a video of anyone and then put another person’s face
  124. on that video
  125. and it looks extremely realistic.
  126. They’re using this in Hollywood,
  127. really extensively now.
  128. And the technology’s getting cheaper and cheaper.
  129. So it’s likely that certain people
  130. could potentially be framed for a crime
  131. or someone could use these types of external surveillance,
  132. maybe, against a person that they wanted to harm
  133. in some way.
  134. Does that give you any additional pause?
  135. Or is this just something that’s
  136. you know, maybe, something we can’t
  137. do anything about?
  138. – I think it’s inevitable that it will happen
  139. in some form or another.
  140. It’s really up to governments and regulators
  141. to have tight controls on this sort of
  142. misuse of AI and so forth.
  143. But, I have faith in governments around the world
  144. that they’ll be able to control it
  145. and ensure that the usage
  146. is for the appropriate reasons.
  147. – Yeah, so our next question
  148. is about how do we regulate some of these technologies,
  149. so we’re talking about artificial intelligence,
  150. David brought up this twist on that previous question
  151. about deep-fakes.
  152. And how do we regulate these?
  153. There’s different countries in the world,
  154. different jurisdictions
  155. – Diverse group here
  156. – A diverse group here,
  157. and, people have different opinions on this
  158. so, you know, will there ever come a time
  159. when perhaps we can have a uniform rule or regulation
  160. that will cover this globally?
  161. Is that something that we…
  162. practically that could be difficult,
  163. but is that something we aspire to do?
  164. So, maybe we start with that question.
  165. Because it’s quite important
  166. as a fundamental starting point
  167. in how we think about,
  168. perhaps regulating some of these technologies.
  169. Do any of you have any thoughts about that?
  170. – Yeah, I guess for a lot of companies nowadays,
  171. how they operate is not really restricted
  172. to a physical space.
  173. I think back in the day, you know,
  174. you think about the retail shop,
  175. it’s confined to a physical space.
  176. Let’s say, if a shop is operating in New York
  177. then they follow New York law.
  178. And then if they’re operating in California,
  179. they follow California law.
  180. But let’s say, nowadays, everyone’s shopping on Amazon,
  181. everyone’s shopping on different online websites,
  182. and at least in the States,
  183. the bar exam and the law
  184. are kinda specific to each state,
  185. so if some sort of crime,
  186. some sort of incident happens,
  187. which jurisdiction’s law do we go by?
  188. And then if we go beyond that, beyond one country,
  189. we buy things from everywhere, I buy things
  190. from the UK, I buy things from Hong Kong,
  191. then who are the lawmakers
  192. or the regulators
  193. to really regulate,
  194. or what law to follow,
  195. or what guideline to follow,
  196. and when there’s some
  197. inter-country incident happening,
  198. which guideline is, like, I don’t know,
  199. the golden rule?
  200. – Mm, yeah
  201. – To dictate, so I guess–
  202. – It’s really complicated, yeah.
  203. – So do you think there should be a universal law
  204. at some point
  205. would that be the best way to deal with this?
  206. Or, can we rely on law
  207. as kind of the way we deal with this,
  208. or should we— are there other mechanisms
  209. perhaps, that we should think about?
  210. – Maybe we need a supreme AI overlord
  211. who can just determine all those things for us,
  212. maybe.
  213. – Yeah, I mean, I don’t know
  214. that’s a good question, I mean
  215. I don’t have an answer to this, I guess.
  216. – Do you think that it’s even feasible
  217. that there will be like global standards?
  218. – [Carl] So, yeah, for instance,
  219. it would be very complicated,
  220. like even for nuclear weapons,
  221. we don’t all agree about it.
  222. So, how it’s gonna work for FinTech or AI
  223. it’s gonna be very complicated I guess
  224. – Yeah
  225. Now, there’s a lot of money involved though
  226. and so you will see that typically where money is involved
  227. and where cross-border commerce is involved
  228. those are the rules
  229. that are typically the most uniform.
  230. So I think the best example would be intellectual property
  231. and you have really large
  232. kinda multi-national organisations
  233. like the WTO, or other global bodies
  234. that kind of force companies into obeying certain rules.
  235. Do you think that the EU, and the US
  236. and China perhaps
  237. would be powerful enough at some point
  238. to kind of force everybody into adopting, like…
  239. ‘Cause you don’t have to convince everybody
  240. you just need to convince a couple core powers
  241. maybe that could be plausible?
  242. – It’s complicated nowadays,
  243. like when you see already the trade war,
  244. China versus US, who’s gonna take care?
  245. Who’s gonna take over?
  246. Who’s gonna decide?
  247. This is a good question.
  248. – Yeah
  249. – So…
  250. – Okay
  251. – Yeah
  252. – So I think the MiFID II example
  253. is quite interesting actually,
  254. because, as Shannon was talking about,
  255. the PII, so personally identifiable information,
  256. uh, that actually ties directly
  257. in to this idea of artificial intelligence,
  258. the data that’s there. – Yeah
  259. – Because that qualifies as PII,
  260. – Yeah
  261. – So that actually becomes
  262. quite an interesting question to think about.
  263. Because if it does then it will fall
  264. automatically under an existing regulation
  265. – Yeah
  266. – And if it doesn’t then, why not?
  267. Because we could say our birth date, address,
  268. national identity number, you know,
  269. are personally identifiable information,
  270. but clearly our face should be a form of PII too, right?
  271. So it does raise some interesting questions.
  272. – Yeah and I do think it also gives me some hope
  273. that there is the potential
  274. for more uniform guidelines going forward.
  275. Because if you think of capital markets, right?
  276. There’s a lot of uniformity,
  277. because if you have a foreign, or overseas company
  278. that’s listed, say, on a USA stock-exchange
  279. then they have to adopt some of those rules.
  280. Contract rules are fairly uniform.
  281. Again, intellectual property.
  282. Product liability rules, even like development
  283. and production of products.
  284. So, basically if these countries
  285. do wanna do business back-and-forth
  286. and AI and information-technology and stuff
  287. is gonna be more cross-border in nature,
  288. then I think it is actually very plausible
  289. that there will be some type of standards going forward.
  290. I guess that the question is
  291. who’s gonna be able to push those things
  292. and enforce them, right?
  293. Enforceability is always, I think,
  294. the biggest challenge when you’re dealing
  295. with cross-border things.
  296. Yeah
  297. – There are different definitions of PII;
  298. there’s no single definition of PII.
  299. – Right, currently, Yeah. – So, yeah
  300. I mean, one country can impose
  301. like thirty different fields
  302. that are PII. – Yeah
  303. But another country may think
  304. that that’s not invasive,
  305. so who’s the authority
  306. to say what’s the right level of control,
  307. what is true PII, if there is such a thing,
  308. – Yeah
  309. and what is bogus PII?
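
As a concrete illustration of the student's point, here is a small sketch (ours; the field lists are invented for the example, not actual legal definitions) showing how the same record can be treated differently depending on whose PII definition applies:

    # Illustrative only: invented PII field lists, not real legal definitions.
    PII_FIELDS = {
        "jurisdiction_a": {"name", "birth_date", "national_id", "address"},
        "jurisdiction_b": {"name", "birth_date", "national_id", "address",
                           "face_image", "device_id", "location_history"},
    }

    def pii_fields_in(record, jurisdiction):
        """Return the fields of `record` treated as PII in `jurisdiction`."""
        return set(record) & PII_FIELDS[jurisdiction]

    record = {"name": "...", "face_image": "...", "purchase_total": 42}
    for j in sorted(PII_FIELDS):
        print(j, "->", sorted(pii_fields_in(record, j)))
    # jurisdiction_a -> ['name']
    # jurisdiction_b -> ['face_image', 'name']

The same record triggers different obligations in each place, which is exactly why "whose definition governs?" matters for cross-border services.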
  310. – Yeah, and this actually is a good tangent
  311. to our third, kind of, topic.
  312. Because, one of the topics that students were commenting on
  313. was really about authority and power.
  314. And, specifically in terms of
  315. introducing AI in order to reduce human bias,
  316. but unfortunately the data
  317. that creates a lot of AI data-sets is already biased,
  318. and so it potentially could re-entrench existing bias.
  319. And so, I’m just curious from your perspective.
  320. So, some of the things we’ve talked about
  321. include mortgages
  322. that are built off of
  323. perhaps biased, racist data,
  324. which then create AI systems where the computer
  325. decides to give worse loan terms
  326. to a minority, let’s say.
  327. Kenny has a technology background,
  328. do you think that introducing artificial intelligence,
  329. machine learning, and these kinds of
  330. amoral, non-human actors is going to create
  331. a more fair and transparent system,
  332. or is the data so tainted
  333. that it’s only gonna entrench these human biases
  334. and kind of make even worse outcomes for some people?
  335. – From a technology point of view,
  336. as we know, where the AI model actually comes from is,
  337. you need to feed in a lot of data,
  338. okay, and that data is based on past history,
  339. for example, from a bank,
  340. the mortgage-approval transaction history.
  341. So if there is some bias
  342. in the very first place,
  343. for example, the approval manager keeps rejecting
  344. mortgage applications because of race,
  345. because of background,
  346. then that data will actually feed into the model,
  347. and then the model will have the bias.
  348. – [David And Teacher] Yeah
  349. – And then, but, the thing is,
  350. from an ethical point of view,
  351. what about the bank, or the government?
  352. Would they like to change this kind of bias?
  353. Okay, so this is quite hard to say:
  354. is it “okay, this is fine, we’re used to doing this,
  355. this is the result we expected”,
  356. or, “hmm, this is not good,
  357. we need to change it”?
  358. I guess it depends on the bank
  359. or the government,
  360. how they treat this,
  361. to make a fair judgement.
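
The point just made, biased history in, biased model out, can be shown with a minimal sketch (ours, using synthetic data, not from the course): a "model" fitted to biased approval history reproduces the bias even though group membership is never an explicit rule.

    # Synthetic illustration: biased history in, biased model out.
    import random

    random.seed(0)

    def biased_history(n=20_000):
        rows = []
        for _ in range(n):
            group = random.choice(["majority", "minority"])
            income = random.gauss(50, 15)
            approve_prob = (income - 30) / 40       # hypothetical income-based rule...
            if group == "minority":
                approve_prob -= 0.25                # ...plus a biased human penalty
            approved = random.random() < max(0.0, min(1.0, approve_prob))
            rows.append((group, income, approved))
        return rows

    history = biased_history()

    # "Model": the empirical approval rate learned from the history,
    # compared at the same income band (45-55).
    def learned_rate(rows, group):
        sample = [ok for g, inc, ok in rows if g == group and 45 <= inc <= 55]
        return sum(sample) / len(sample)

    for g in ("majority", "minority"):
        print(g, f"{learned_rate(history, g):.0%}")
    # Same incomes, different learned approval rates: the bias in the
    # training data has become the model's behaviour.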
  362. – Yeah, okay great
  363. and as someone who’s again, in technology
  364. do you think that there are things
  365. that we can do now
  366. to hopefully ensure a more transparent
  367. and ethical system?
  368. – Of course, I believe the government
  369. has to take the lead,
  370. to educate the banks, the organisations
  371. who use AI.
  372. They have to promote the fair use,
  373. or the ethical side, of AI.
  374. And then regulation also,
  375. from different points of view,
  376. discrimination, racism, different kinds of aspects.
  377. They have to set very clear directions and regulations.
  378. But it takes time
  379. to do that.
  380. – What can we do?
  381. Let’s assume, I think everybody that has been
  382. thinking about these AI types of issues,
  383. about data bias and things like that,
  384. I think we all understand
  385. the basic issue, in terms of,
  386. if you get bad data in, bad outcomes come out.
  387. But let’s say that happens, though,
  388. unintentionally.
  389. What is the process to address that?
  390. Okay, Carl, yeah.
  391. – So, in fact, the first good thing
  392. is to be able to realise,
  393. thanks to AI, that we are doing things
  394. in the wrong way.
  395. So we are making mistakes,
  396. and to recognise those mistakes.
  397. And then we’re gonna be able to build on that,
  398. and to recognise that with ethics,
  399. we’re gonna be able to build the right model,
  400. not to have discrimination
  401. around gender, around race or whatever.
  402. I think it’s a good point.
  403. And to bring new data to build a model
  404. which is gonna make the world,
  405. let’s say, uh…
  406. the world a better place.
  407. (laughter)
  408. – There we go, a typical Silicon Valley mantra.
  409. Okay, so, kinda concluding thoughts:
  410. give us like, one or two lines
  411. of why maybe you’re excited about AI
  412. and its potential for the future.
  413. Do you wanna give us an idea?
  414. What is it about AI that kind of excites you
  415. from a FinTech standpoint?
  416. – From a FinTech standpoint,
  417. we think it’s actually making our lives better
  418. in many ways.
  419. Like, the payment system has been changing a lot
  420. over the past few years.
  421. But, to a certain extent, I find that
  422. with AI there are a lot of data-sharing issues,
  423. privacy issues, that also arise at the same time.
  424. So there always has to be a balance
  425. between AI usage and how it should be used
  426. in an ethical way.
  427. – So based on what we’ve discussed already,
  428. I think that one of the most powerful aspects of FinTech
  429. is the inclusivity it brings to people
  430. and especially to people
  431. who don’t have access to traditional financial services
  432. and that’s one of the biggest benefits.
  433. I come from an emerging market, Bangladesh
  434. and we talked about that at length in this course.
  435. How FinTech has brought so much access to financial services
  436. to the unbanked.
  437. And I think that’s one of its greatest potentials:
  438. for all the risks, such as bias
  439. and other things that marginalise people,
  440. there’s a huge positive to FinTech,
  441. which actually brings people access to financial services
  442. to improve their quality of life.
  443. – Yeah it’s interesting so I’ve always
  444. thought of Bangladesh as a good example
  445. because, University of Hong Kong,
  446. we’ve got a great university,
  447. and yet no Nobel prize laureates,
  448. whereas University of Dhaka,
  449. also a great university,
  450. has, I think, two.
  451. and a lot of times
  452. the most beneficial emergent technologies
  453. happen in emerging markets.
  454. Because it’s really just out of necessity, right.
  455. So if you think of Muhammad Yunus
  456. he’s very explicit, he’s like
  457. “I wasn’t trying to invent Microfinance,
  458. I was just trying to serve the needs
  459. of a lot of people”
  460. And so I think some of it for me,
  461. again, I love the developing space as well
  462. and I think some of the most interesting utilizations
  463. of these technologies are really in that space
  464. because the potential impact is just astronomical
  465. it’s really cool, yeah.
  466. – And that’s also brilliant in the sense of,
  467. going back to the topic of eliminating bias in AI
  468. because, what, David you were talking about
  469. the bottom of the pyramid.
  470. They’re collecting data from the bottom of the pyramid.
  471. – Yeah
  472. – They’re not collecting only biased data
  473. from rich people or privileged people,
  474. so that data is going to be incredibly powerful
  475. towards contributing to less biased AI.
  476. – And, just to clarify
  477. I’m not sure if this is what you meant, but
  478. there has not been data collected on them
  479. in this way before,
  480. which means it’s completely untainted.
  481. Hopefully.
  482. If it’s done properly.
  483. So you’ve just got a clean slate moving forward.
  484. Yeah, great.
  485. So any other kind of final comments that anybody has?
  486. Or excitement about AI?
  487. – I think not only FinTech, but technology in general,
  488. now draws a lot of attention,
  489. so there is a lot of discussion,
  490. just like what we’re doing today,
  491. and actually it’s a positive element
  492. to revisit the ethics and the risks
  493. and also the current system.
  494. What is the issue with the current system?
  495. Why did the current system,
  496. for example the financial system,
  497. not serve some…
  498. – Yeah
  499. – People – Yeah
  500. – Where’s the gap?
  501. And then, challenging business as usual
  502. actually can have a lot of potential
  503. to add a lot of value to the economy,
  504. though also at some expense.
  505. – Yeah
  506. – Great, well,
  507. we really appreciate our students for joining us
  508. in this roundup.
  509. For all of you in our online course, we really wish
  510. we could actually meet with you in this kind of
  511. setting and capacity, and engage with these ideas.
  512. Hopefully we’ll be able to continue to do this
  513. in some way moving forward,
  514. and we really look forward to connecting again
  515. after module five,
  516. which is really important as well,
  517. as we revisit some of these questions
  518. in a broader structural sense.
  519. – Talk to you next week
  520. – Thank you

Module 5 A Decentralized Future

5.0 Module 5 Introduction

  1. – Welcome to module five.
  2. In this module,
  3. we’re gonna talk about some of the key reasons
  4. that people are calling for FinTech innovation,
  5. and one of the main ones is the decentralised
  6. nature of all this,
  7. thus democratising finance and allowing regular people
  8. to participate more fully and affordably
  9. in financial transactions
  10. through technologies like cryptocurrency,
  11. non-government-issued IDs, peer-to-peer lending,
  12. things of that nature.
  13. And so in this module we’re gonna address
  14. some really big questions,
  15. considering whether FinTech should lead
  16. to a decentralised, democratised system of finance,
  17. or whether existing institutions
  18. will adopt FinTech strategies
  19. to cement their existing hold on financial markets.
  20. During this module we’re gonna discuss these major themes,
  21. including the perceived desire
  22. for democratisation of goods and services.
  23. Is that good or bad?
  24. Will there be unintended consequences?
  25. So a lot of this is gonna be a continuation
  26. of things that we’ve talked about in other modules,
  27. including module two where we talked about blockchain.
  28. So we’ll be referencing back to those from time to time.
  29. And one of the key things that we’re gonna be emphasising
  30. is the sources of power.
  31. So, for example, will FinTech innovation
  32. lead to a decentralisation of power,
  33. or maybe a concentration of existing power sources
  34. like governments and banks?
  35. Or will a new concentration of power really be created
  36. in TechFins like Amazon or Tencent,
  37. which owns an app like WeChat, for example?
  38. Okay, we’re gonna start module five with a quick story.
  39. Now, I’ve been living and working within China
  40. for over a decade
  41. and travelled there many times.
  42. Last year, I had the opportunity to go
  43. to a small part of Western China that I’d never been to before.
  44. So imagine kind of a rural desert landscape
  45. with kinda dust everywhere.
  46. In the morning I decided to not eat breakfast in the hotel
  47. and instead went out to get something on the street.
  48. Now, I noticed that there was a particular street vendor,
  49. an old woman,
  50. who, you could tell, had been doing this
  51. for a very long time,
  52. and she was surrounded by people,
  53. so it was obvious that her food was quite popular.
  54. So, I went over there quite excitedly,
  55. I kinda watched what everybody was doing,
  56. and then when it came to be my turn I ordered some food.
  57. Thankfully I can speak Chinese
  58. and so that part of it was easy from a cultural standpoint.
  59. As I reached into my wallet
  60. and started to pull out some cash,
  61. I could immediately see the concern on her face
  62. when we both had the realisation
  63. that every other single person in that circle
  64. had paid using their phone
  65. and she did not have the ability
  66. to give me change with cash.
  67. So here I was,
  68. probably the person who had the most access
  69. to the financial industry,
  70. and yet I was the one
  71. that was completely cut out of the transaction
  72. and out of this marketplace.
  73. Now, I found this experience,
  74. even though I couldn’t get my breakfast
  75. and I was a little upset about that,
  76. I found this experience super cool
  77. because this community in rural Western China
  78. had in a short amount of time
  79. moved almost completely away from cash.
  80. And indeed anyone that’s been to China recently knows
  81. that most communities are the same way,
  82. and the government, as those that have been there know,
  83. is actually very supportive of this change.
  84. So, we find this story is gonna be a good summary
  85. not only for the things that we’ve discussed
  86. in modules one through four
  87. but also it’s gonna be a nice transition
  88. to help us start asking some of the bigger questions
  89. that we’re gonna be analysing in modules five and six.
  90. So, before you move on to the next video,
  91. we just wanna ask you, think about the story,
  92. and from a FinTech standpoint
  93. what are some of the observations that you have,
  94. especially about how FinTech is impacting local people,
  95. average people in rural communities or advanced communities
  96. all over the world.

Additional Readings

  • He, D., Leckow, R., Haksar, V., Mancini-Griffoli, T., Jenkinson, N., Kashima, M., Khiaonarong, T., Rochon, C., Tourpe, H. (2017). Fintech and Financial Services: Initial Considerations.  IMF Staff Discussion Note. Retrieved from https://www.imf.org/~/media/Files/Publications/SDN/2017/sdn1705.ashx
  • Chuen, K., & Lee, D. (2017). Decentralization and Distributed Innovation: Fintech, Bitcoin and ICO’s. Stanford Asia-Pacific Innovation Conference. Retrieved from http://dx.doi.org/10.2139/ssrn.3107659
  • Magnuson, W. J., (2017). Regulating Fintech. Vanderbilt Law Review, Forthcoming; Texas A&M University School of Law Legal Studies Research Paper No. 17-55. Retrieved from https://ssrn.com/abstract=3027525

5.1.1 Is FinTech Leading to Inclusion or Exclusion?

  1. – Okay, welcome back.
  2. Now although that was a quick story,
  3. we hope that you had a chance to kind of think about it
  4. because we believe a lot can be observed from it.
  5. Now let’s consider a few of these things.
  6. – Probably the most immediate observation
  7. is something that we’ve already discussed,
  8. the scale and penetration of FinTech innovation
  9. is faster and broader than anything we’ve seen before.
  10. Everyone on that street had a modern smartphone,
  11. and they had all adopted the technology
  12. into their daily routine.
  13. One reason this is true is because FinTech innovation
  14. can lead to efficiencies,
  15. which in turn can help a lot of people.
  16. And as discussed in module two, FinTech innovation
  17. can help cut out the middleman which saves costs.
  18. And for the street vendor, using an app payment system means
  19. not needing to handle cash,
  20. which likely means a reduced risk of theft
  21. and, for a food worker, is more sanitary.
  22. And in many cases, using cashless payments
  23. is just faster and more efficient, leading to better service.
  24. So basically, using a mobile payment system
  25. helps her business be more efficient
  26. and hopefully more profitable.
  27. – But FinTech innovations can also lead to exclusion.
  28. I probably had the greatest access to traditional finance,
  29. whether cash, credit or other loans,
  30. of anyone on that street,
  31. yet I was almost completely excluded from the marketplace,
  32. not even able to purchase breakfast.
  33. Like in this example, FinTech can lead to separation
  34. from financial markets, and therefore basic necessities.
  35. Although buying breakfast is a simple transaction,
  36. there are many layers of filtering in the story.
  37. For example, if you are required to pay with the phone,
  38. then guess what, you have to have a phone.
  39. And then you have to have the app WeChat
  40. and then an account on WeChat,
  41. and then money in or credits in that account.
  42. So one of the interesting challenges
  43. the FinTech industry faces
  44. concerns access to these technologies.
  45. While many are hopeful that FinTech innovations
  46. will lead to better access to finance for the masses,
  47. others are concerned that it could also lead
  48. to increased exclusion from basic services.
  49. And this can be particularly true if governments decide
  50. to intentionally exclude some people from these platforms.
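
The "layers of filtering" described just above can be sketched as a simple checklist (a toy model of ours, not any actual payment platform's logic): each requirement silently excludes everyone who fails it.

    # Toy model of the access layers in the breakfast story.
    REQUIREMENTS = [
        ("has_phone",   "owns a smartphone"),
        ("has_app",     "has the payment app installed"),
        ("has_account", "has a registered account on the app"),
        ("has_balance", "has money or credits in that account"),
    ]

    def can_pay(person):
        """Return (included, reason): the first missing layer excludes the person."""
        for key, description in REQUIREMENTS:
            if not person.get(key, False):
                return False, "excluded: " + description + " is required"
        return True, "included in the marketplace"

    visitor = {"has_phone": True, "has_app": False}   # e.g. a cash-carrying traveller
    print(can_pay(visitor))
    # (False, 'excluded: has the payment app installed is required')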
  51. – Now, going back to the example
  52. of the street vendor in China,
  53. we now ask a big question, the big question of the day.
  54. Will these innovations in FinTech
  55. bring the world closer together
  56. through multinational FinTech solutions,
  57. or will we become more and more isolated from each other?
  58. For example, credit cards have made it easy
  59. to make purchases around the world, no matter where I am.
  60. You know, I feel confident
  61. I can make necessary payments using my credit card,
  62. though sometimes that means maybe paying high fees.
  63. – But on the other hand, David and I’ve been travelling to
  64. and teaching in China for many years.
  65. And while we love travelling there,
  66. from a FinTech perspective, every year,
  67. it seems more and more insular
  68. and disconnected from the rest of the world
  69. because of this simple paradox:
  70. the better their app ecosystem gets
  71. and the more people in China become interconnected
  72. via these apps, the more they simultaneously distance themselves
  73. from the rest of the world.
  74. Now, this was made clear in the exercise
  75. with the Chinese street vendor,
  76. and this is happening in other countries too.
  77. Okay, so let me ask you another question.
  78. Do we think that FinTech is bringing the world together
  79. or pushing us further apart?

5.1.2 Money and Currency – Trust and Power

  1. – Another interesting lesson from the street vendor example,
  2. was that value was being transferred,
  3. but was this money in a traditional sense?
  4. It wasn’t physical currency, that’s for sure.
  5. But there was currency backing that transaction,
  6. even if, distantly, on some cloud somewhere.
  7. This is very different than systems of payment
  8. that have existed for thousands of years,
  9. and will likely lead to the next evolution
  10. of not only how we pay, but also our perception of money.
  11. And actually, on that note,
  12. kind of an interesting vignette or an example,
  13. is even in North Korea,
  14. traditionally a country that we think is cut off
  15. from the global financial system,
  16. increasingly many people in North Korea now use cell phones.
  17. And frequently they have to top up,
  18. like in other countries in the world, to buy credits.
  19. But now those credits can be transferred
  20. from one user of a cell phone to another
  21. as a form of payment.
  22. And so this is also a different concept of money.
  23. – Now this reminds me about one of my favourite things
  24. about living in Hong Kong, the Octopus Card.
  25. With this little card, I can pay for just about anything.
  26. Transportation, food, and even government services.
  27. And you can see it’s pretty worn here,
  28. I’ve used this card practically every day
  29. since my family moved here in 2007.
  30. And my children almost exclusively use their card
  31. to make purchases.
  32. – While some may say that the shift
  33. to making payments from phone apps or the Octopus Card
  34. is just the next natural iteration in making payments,
  35. much like the credit card
  36. and the handwritten cheque before that,
  37. and that people just get used to
  38. the changes, in the way that
  39. children in Hong Kong are already used to paying
  40. with an Octopus Card when they go to the store,
  41. We do need to understand that such changes
  42. can actually have very significant implications for society.
  43. – Now for example, there are personal implications.
  44. As we know, these developments can make it easier
  45. to access and use money,
  46. having your credit card means
  47. you don’t have to carry around thousands of dollars in cash
  48. to make a purchase, for example.
  49. But on the other hand, studies seem to indicate
  50. that people spend more money when using a credit card
  51. than when using cash.
  52. And there’s reason to believe
  53. that people spend even more money
  54. when using an app or a web-based service
  55. than when using a credit card.
  56. It’s believed that the proximity issue
  57. that we discussed previously has a lot to do with this.
  58. This is pretty common sense, right,
  59. holding cash is proximate, and therefore forces us
  60. to think about the work that went into earning the money.
  61. Thus we naturally spend less when we’re holding cash.
  62. – But beyond these personal implications,
  63. as mentioned before,
  64. there are broader societal implications to consider.
  65. Now going back to David Bishop’s experience
  66. with the street vendor in China.
  67. This was a great example of disruption
  68. of the many formidable institutions
  69. that for millennia have controlled not only finance,
  70. but most other aspects of institutional power.
  71. Remember, there were no banks,
  72. whether physical or virtual, in this scenario.
  73. WeChat, called Weixin in China, isn’t a bank
  74. or even a financial institution in the traditional sense.
  75. That’s why we refer to them as TechFins:
  76. large technology companies
  77. that because of their size, user base and overall scale,
  78. are starting to move into areas of commerce
  79. and services traditionally controlled by banks.
  80. – But not only were there no banks,
  81. there was also no physical currency.
  82. As I’m sure you’re aware,
  83. banknotes and coins can only be produced
  84. by government-approved organisations.
  85. For example, US dollars are printed
  86. by the Bureau of Engraving and Printing,
  87. and are issued by the Federal Reserve.
  88. And here in Hong Kong, our banknotes
  89. are printed by Hong Kong Note Printing Limited,
  90. and issued by three banks.
  91. So here I have three $100 notes.
  92. This one issued by the Bank of China Hong Kong Limited.
  93. This by the Hong Kong and Shanghai
  94. Banking Corporation Limited, more commonly known as HSBC.
  95. And finally this one, issued by Standard Chartered Bank.
  96. – Okay, so who cares?
  97. You’ve all probably held and seen foreign currency
  98. at some point in your life,
  99. and know how it can be a pain to exchange currency
  100. when going from country to country.
  101. You’ve also probably had the experience
  102. of shopping in another country,
  103. and trying to convert the cost of something
  104. from one currency to another.
  105. Now, to be honest, although I’ve lived in Hong Kong
  106. for nearly ten years,
  107. I still find myself frequently converting
  108. the price of a good into US dollars
  109. just to help me get a better sense of the cost or value
  110. of that particular item.
  111. Well the reason this matters
  112. is a combination of trust and power.
  113. As we outlined in Module One,
  114. the value of currency is really only sustained
  115. by a broad sense of communal trust.
  116. And by changing the nature of money,
  117. we are potentially altering the foundations of trust,
  118. which can have broad implications across society.
  119. – But also, this is about power.
  120. The ability to print your own currency
  121. is a significant source of power.
  122. Maybe you’ve seen a movie where criminals
  123. try to steal or create ink plates
  124. so that they can print their own money,
  125. and to be honest, when I was a kid,
  126. that was a dream of mine.
  127. As another example, there’s been a lotta discussion
  128. about the power the United States has in the world
  129. because of the outsize influence of the US dollar,
  130. which is widely considered the world’s reserve currency.
  131. As an example, right here in Hong Kong
  132. our money is pegged to the US dollar.
  133. Meaning the value of the Hong Kong dollar
  134. rises and falls along with the US dollar.
  135. So think about how much power that involves.
  136. It means that some folks in the US
  137. that maybe have never even been to Hong Kong,
  138. can change the value of our currency here,
  139. which in turn can affect the cost of everyday goods,
  140. housing prices, the value of your personal savings,
  141. company profitability and many many other things.
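
To see what the peg means in numbers, here is a tiny sketch (the peg band of roughly 7.75-7.85 HKD per US dollar is real; the scenario figures are invented for illustration):

    # Toy illustration of a currency peg passing USD moves through to HKD prices.
    PEG_RATE = 7.80  # HKD per USD, roughly the middle of the 7.75-7.85 band

    def hkd_price(price_eur, usd_per_eur):
        """HKD price of a euro-denominated good, converted via the USD peg."""
        return price_eur * usd_per_eur * PEG_RATE

    print(round(hkd_price(100, 1.10), 2))    # 858.0  HKD before the move
    print(round(hkd_price(100, 1.155), 2))   # 900.9  HKD after the USD weakens 5% vs EUR

Because the HKD is pegged, a move in the US dollar that Hong Kong had no part in immediately changes what the same imported good costs here.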

Additional Readings

  • Hardekopf, B. (2018). Do People Really Spend More With Credit Cards? Forbes. Retrieved from https://www.forbes.com/sites/billhardekopf/2018/07/16/do-people-really-spend-more-with-credit-cards/#79b01d11c19a 
  • Prelec, D. & Simester, D. (2001). Always Leave Home Without It: A Further Investigation of the Credit-Card Effect on Willingness to Pay. Marketing Letters, 12(1), 5-12. Retrieved from https://doi.org/10.1023/A:1008196717017 (paywall)

5.1.3 Will Governments Accept New Currencies?

  1. – Okay, so returning to the street vendor example again.
  2. On that street in China there was no physical currency involved,
  3. now is that the goal of FinTech Innovation
  4. to eliminate all physical currency?
  5. Or perhaps even government backed currency altogether?
  6. If the latter, do you think that governments
  7. will just roll over and allow
  8. their control of their currency to be taken away?
  9. – Now in the China street vendor example
  10. although people were paying with the web app,
  11. the transactions were still backed
  12. by government-issued currency.
  13. What happens when this is not the case?
  14. Cryptocurrencies like Bitcoin are usually not
  15. government issued, or even government backed.
  16. Will governments be willing to allow
  17. the use of these cryptocurrencies
  18. within their borders?
  19. And possibly even adopt
  20. one of these currencies as one of their own?
  21. – Some nations have already announced
  22. that they would move to cryptocurrency in some form.
  23. For example, Venezuela launched
  24. the Petro cryptocurrency to help the country
  25. amid international economic sanctions.
  26. But the Marshall Islands is the first country
  27. to launch a legal tender cryptocurrency
  28. meaning their new currency will be
  29. recognised as legal tender, real money,
  30. and will have equal status with their
  31. current currency, which is, you’ve guessed it,
  32. the US dollar.
  33. We told you, the US dollar, lots of power.
  34. – Even the name of the new Marshallese currency,
  35. it’s called ‘Sovereign’ after all,
  36. is a statement about power.
  37. The name was chosen to emphasise
  38. the sovereignty of the country
  39. which has a history of colonisation, nuclear testing
  40. and resulting poverty.
  41. When discussing the controversial cryptocurrency
  42. the president said “This is a historic moment
  43. for our people, finally issuing and using
  44. our own currency, alongside the
  45. US Dollar. It is another step
  46. manifesting our national liberty”.
  47. – This switch will have a lot of implications,
  48. and could be the start
  49. of a new age for money and finance.
  50. Interestingly, the International Monetary Fund, the IMF
  51. warned the Marshall Islands government
  52. about issuing such a cryptocurrency.
  53. They were concerned that the currency
  54. could be manipulated by crime syndicates
  55. and fraudulent business practices.
  56. The types of activities that have often
  57. been tied to cryptocurrencies.
  58. And also that foreign governments could
  59. cut financial aid to the Marshall Islands
  60. if they broke from the US dollar
  61. in favour of their own e-currency.
  62. Okay so let’s stop and consider some important questions.
  63. Do you think that cryptocurrency and other FinTech
  64. solutions will ever be largely adopted
  65. by banks and governments?
  66. Or, will they lead to a decentralised future
  67. where banks and governments are less
  68. influential in these areas?
  69. Will TechFins take over the finance industry?
  70. And, will other countries adopt cryptocurrency
  71. as a legal tender?

Additional Readings

  • Venezuela Starts Pre-Sale of Petro Cryptocurrency (2018). Deutsche Welle. Retrieved from https://www.dw.com/en/venezuela-starts-pre-sale-of-petro-cryptocurrency/a-42655274
  • Sovereign cryptocurrency: Marshall Islands to launch world-first digital legal tender (2018). Deutsche Welle. Retrieved from https://www.dw.com/en/sovereign-cryptocurrency-marshall-islands-to-launch-world-first-digital-legal-tender/a-42810832
  • Young, J. (2018). Why the IMF is Trying to Stop Marshall Islands From Adopting Crypto. CCN. Retrieved from https://www.ccn.com/why-the-imf-is-trying-to-stop-marshall-islands-from-adopting-crypto

5.1.4 Will FinTech Take Control of Financial System?

  1. – Okay so in summary,
  2. FinTech is leading to some really amazing efficiencies
  3. that can help a lot of people
  4. bypass middlemen and save money,
  5. but it can also lead to exclusion within countries
  6. and exacerbate divides between countries.
  7. This is largely because these innovations
  8. are completely changing the very concept of money,
  9. which is leading to questions
  10. about trust, proximity, and especially power.
  11. – So where does this leave us?
  12. Will banks allow their power
  13. to be eroded by decentralised cryptocurrencies,
  14. peer-to-peer lending networks
  15. and other FinTech innovations?
  16. Or will they use these developments
  17. to further consolidate their power over financial products?
  18. – Will governments stand idly by
  19. while their power over currency,
  20. personal identification,
  21. and other traditionally government-held functions
  22. is taken away by FinTech startups?
  23. – And how will both banks and governments
  24. react to the rise of TechFins,
  25. who seem to be growing daily,
  26. increasing in both power and profits
  27. as they expand further and further into services
  28. traditionally handled by other institutions?
  29. In this module we will explore some of these questions.
  30. But first, let’s talk about what we mean
  31. when we say decentralised,
  32. or democratised.

Additional Readings

  • Zetzsche, D. A., Buckley, R. P., Arner, D. W., & Barberis, J. N. (2017). From FinTech to TechFin: The Regulatory Challenges of Data-Driven Finance. University of Hong Kong Faculty of Law Research Paper No. 2017/007. Retrieved from http://dx.doi.org/10.2139/ssrn.2959925
  • Marous, J. (2018). The Future of Banking: Fintech or Techfin? Forbes. Retrieved from https://www.forbes.com/sites/jimmarous/2018/08/27/future-of-banking-fintech-or-techfin-technology/#5bbdbbd15f2d
  • Ren, D. (2018). Tightening Regulations Make FinTechs Easy Takeover Targets for Banks Stepping Up Digitalisation Drive. SCMP. Retrieved from https://www.scmp.com/business/companies/article/2159718/tightening-regulations-make-fintechs-easy-takeover-targets-banks

5.2.1 Is FinTech an Evolution or a Revolution?

  1. – Now, if you pay attention to the FinTech space
  2. chances are that you’ve heard the words
  3. ‘decentralised’ and ‘democratised’ a lot.
  4. FinTech experts of all varieties
  5. love throwing around these terms
  6. but what do they really mean in a FinTech context?
  7. – Well, many see FinTech development
  8. as a natural process of technical advancement,
  9. much like the locomotive surpassing the stagecoach.
  10. Others see FinTech as a direct response to,
  11. and possibly even a fight against,
  12. traditional centres of financial power.
  13. Some believe the power to control banking, currency,
  14. and even our own identity, has been held by the elite few
  15. and that the control of finance
  16. has been neither transparent nor democratic.
  17. – Whether as a result of the natural evolution of technology
  18. or as a direct backlash against existing power structures,
  19. the reality is that FinTech is seen by many
  20. as having the potential to completely change,
  21. and possibly even destroy,
  22. existing financial power structures.
  23. And it’s important to understand
  24. both these motivations and possible outcomes.
  25. So do you think that FinTech advancements
  26. are a natural evolution of technology
  27. or a direct result of mistrust of institutional power?

Additional Readings

  • Wilkins, C. A. (2016). Fintech and the Financial Ecosystem: Evolution or Revolution? Bank of Canada. Retrieved from https://www.bankofcanada.ca/2016/06/fintech-financial-ecosystem-evolution-revolution/

5.2.2 Have We Lost Trust in Financial Institutions?

  1. – As we have discussed many times,
  2. finance is largely built on trust, and in the past,
  3. institutions like banks and governments have served
  4. as the guarantors of trust in the financial world.
  5. But whether as a cause or effect,
  6. trust in institutions has diminished significantly
  7. in many countries over the past decade.
  8. – Now, probably the best recent example
  9. of a cause for distrust in the financial world
  10. is the global financial crisis,
  11. including all the major financial scandals
  12. that were exposed as a result of the crisis.
  13. Now, most of us
  14. had to stand by and powerlessly watch
  15. as the global financial system nearly collapsed.
  16. We had a daily reminder of how flippantly
  17. certain members of the global financial community
  18. pursued profits at the expense of their customers
  19. and how government regulators
  20. were not really sufficiently protecting us.
  21. – Millions of people around the world lost their homes,
  22. their savings and essentially their futures.
  23. In the US alone, it is estimated that American households
  24. lost approximately $20 trillion
  25. in wealth as a result of the financial crisis.
  26. And as a result, it is no surprise that many of these people
  27. began to de-trust the very institutions
  28. that were meant to protect and serve them.
  29. – Now, as a personal example, my wife and I
  30. bought our first house the year I graduated law school
  31. which was around 2005.
  32. Obviously we didn’t know that we were buying
  33. at pretty much the worst time possible
  34. with the financial crisis decimating the real estate market
  35. only two years after we purchased our home.
  36. Now, when the market crashed,
  37. the value of our home dropped by over 30%
  38. and it took a really long time to recover.
  39. Well, my family recently sold our home.
  40. It was about 13 years after we purchased it.
  41. The selling price?
  42. Exactly the same amount
  43. that we purchased it for back in 2005.
  44. So, while we’re grateful that we didn’t really lose
  45. any money in nominal terms (see the note after this transcript),
  46. there was a lost decade where many people around the world
  47. lost most of their net worth
  48. and have struggled to recover ever since.
  49. – Let’s be honest, many large financial institutions
  50. have not done much since the financial crisis
  51. to reduce our concerns.
  52. As noted earlier in this course,
  53. banks such as Wells Fargo and HSBC
  54. have had multiple high-profile scandals
  55. that have gutted their customers’ trust.
  56. And it seems every week some new scandal comes out
  57. involving one financial institution or another.
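
A note on the home-sale example above: selling for the same nominal price 13 years later is still a loss in real terms once inflation is counted. Here is a minimal back-of-the-envelope sketch of that arithmetic in Python; the purchase price and the roughly 2% average annual inflation rate are illustrative assumptions, not figures from the lecture.

    # Rough real-terms arithmetic for the home-sale anecdote.
    # The price and the inflation rate are illustrative assumptions.
    purchase_price = 300_000      # hypothetical 2005 price in USD
    sale_price = purchase_price   # sold ~13 years later for the same nominal amount
    years = 13
    inflation = 0.02              # assumed average annual inflation

    # Deflate the sale price back into 2005 dollars
    real_value = sale_price / (1 + inflation) ** years
    real_loss_pct = (1 - real_value / purchase_price) * 100

    print(f"Sale price in 2005 dollars: ${real_value:,.0f}")  # ~ $231,900
    print(f"Real-terms loss: {real_loss_pct:.0f}%")           # ~ 23%

Under these assumptions, the ‘break-even’ sale still cost roughly a fifth of the home’s real value, which is the kind of quiet loss behind the ‘lost decade’ the speaker describes.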

Additional Readings

  • The Financial Crisis Response in Charts (2012). The Department of the Treasury. Retrieved from https://www.treasury.gov/resource-center/data-chart-center/Documents/20120413_FinancialCrisisResponse.pdf
  • Srivastava, S. (2017). Nine Years On From the Financial Crisis, Banks Are Still Working to Rebuild Trust. CNBC. Retrieved from https://www.cnbc.com/2017/09/19/nine-years-on-from-the-financial-crisis-banks-are-still-working-to-rebuild-trust.html

5.2.3 Can We Trust TechFins?

  1. – Now during the past decade,
  2. as people shunned banks
  3. and traditional holders of power,
  4. they turned instead to the so-called TechFins,
  5. digital platforms like Facebook, Amazon,
  6. Google, and Tencent,
  7. that provide eCommerce, peer-to-peer lending,
  8. and communications, and that increasingly serve
  9. as the keepers of our digital identity.
  10. – The rise of the gig economy and social media sites
  11. has meant customers now have more control
  12. over nearly every consumer service,
  13. whether hailing a taxi,
  14. deciding where to stay while on holiday,
  15. or even choosing how to pay their bills.
  16. These large digital platforms have surpassed
  17. many traditional financial institutions,
  18. not only in terms of customer engagement,
  19. but also in terms of trust.
  20. – But after more than a decade of explosive growth,
  21. the TechFins are themselves now caught up in many scandals
  22. and are seeing their own trustworthiness questioned.
  23. And some contend
  24. that these companies are now so large and powerful
  25. that they’re actually influencing government policy
  26. and even national elections.
  27. So, here’s a question for you.
  28. Do you trust the TechFins like Amazon and Tencent
  29. more than you trust banks?
  30. Do you think that the TechFins should be regulated
  31. like utilities?
  32. And what do you think
  33. about companies like Facebook
  34. entering the crypto payment space?

Additional Readings

  • Tasca, P. (2018). The Hope and Betrayal of Blockchain. The New York Times. Retrieved from https://www.nytimes.com/2018/12/04/opinion/blockchain-bitcoin-technology-revolution.html
  • De Vynck, G., & McLaughlin, D. (2019). Big Tech Is Armed and Waiting to Repel U.S. Antitrust Onslaught. Bloomberg. Retrieved from https://www.bloomberg.com/news/articles/2019-06-06/big-tech-is-armed-and-waiting-to-repel-u-s-antitrust-onslaught
  • Smith, U., & Smit, L. (2019). Zuckerberg is Right: Third-Party Standards Must Govern Online Speech. VentureBeat. Retrieved from https://venturebeat.com/2019/04/06/zuckerberg-is-right-third-party-standards-must-govern-online-speech/
  • Heskett, J. (2019). What’s the Antidote to Surveillance Capitalism? Harvard Business School Working Knowledge. Retrieved from