Category Archives: Digital Transformation

I don’t really want to listen to Marvin Gaye’s new album any time soon.

I was sent a link via WhatsApp from one of my Fintech buddies. It was a YouTube link to Kurt Cobain singing Hole’s “Live Through This,” and I initially thought, “Neat!” My WhatsApp friend retorted, “Scary.” 

And I listened to it again. And again. I read a lot of the YouTube comments, which included:

“I don’t know what to think right now.”

“I’m not sure if I love this or hate it.”

“This is the stuff of nightmares, removing the soul and emotion in music and trying to recreate it with an algorithm…”

And one: “Kurt singing ‘Wild Boys’ by Duran Duran.”

That last one was the one that hit me, because I then found loads more on YouTube. The late Freddie Mercury covering “Hey Jude.” Madonna covering “Billie Jean.” Loads more Kurt songs… Paul McCartney doing “My Sweet Lord.”

If you search “AI remixes” on YouTube, well, enjoy.

It’s happening so quickly, and it is rising fast on the public radar in our new generative AI world.

There is some good here, perhaps. Taking artists’ vocals and remixing them into a younger version of what they would have sounded like… that might not be bad, if the artist is okay with it (and still alive to consent to it).

But what hit me as I was scouring the “AI remix” universe was coming across… Jeff Buckley. Someone did AI remixes of Jeff Buckley, as if Kurt Cobain weren’t bad enough.

Jeff Buckley

Jeff Buckley was born in Anaheim, California, and sadly only recorded one studio album, “Grace.” 

This is one of my favorite albums of all time (hey, David Bowie considered “Grace” the best album ever made and said it would be one of his ten “Desert Island Records,” so I’m in good company.)

It was Mojo Magazine’s best album of 1994, ahead of the likes of Oasis’ “Definitely Maybe,” Nirvana’s “Unplugged in New York,” Blur’s “Parklife,” and Neil Young & Crazy Horse’s “Sleeps With Angels.”

His version of Leonard Cohen’s “Hallelujah” will remain the most haunting, beautiful piece of alternative indie music ever made – it makes you shiver. He sounds gorgeous. To help put my daughter to sleep at night when she was little, I used to play “Hallelujah” to her, singing gently with my ukulele (I sing like Jeff Buckley, right?!).

This is “Hallelujah”.

At just 30 years old, on May 29, 1997, Jeff Buckley went swimming, fully clothed, in a channel of the Mississippi River. He drowned, and his autopsy showed no signs of drugs or alcohol in his system. The death was ruled an accidental drowning. It’s a weird, mysterious tragedy that still bothers me today.


I’ve digressed, but I found an AI remix of him covering a song by one of my other favorite artists, Lana Del Rey: “Norman F—–g Rockwell.”

It sort of upset me, or at least hit me in a way I didn’t expect it to. Even more than with Paul McCartney: this was a vocalist whose range spanned (at least) four octaves, yet here he was, singing this song as an AI creation, in… one? It resembled him, but in an unhinging way, partly because I love both the original song and him.

This is not Jeff Buckley.

McCartney

I’ve kept listening to AI versions of Paul McCartney: one covering Billy Joel’s “Scenes from an Italian Restaurant,” another Don McLean’s “American Pie.” But after listening to these for a few days, I realized that while they sounded like Paul McCartney, they weren’t him. They lacked… him.

Just him.

There was a lack of empathy—an attempt to reproduce him, perhaps, but an absence of him. 

Now, as a McCartney fanatic, I could tell over time. But at first, my wife thought some of these were McCartney covers. But not recent Sir Paul – a younger one. It struck me how easily an innocent bystander could be, well, perhaps deceived.

I’ll take the AI versions as fun interpretations and playful re-recordings of the original tracks, which make them sort of interesting. But I’m stopping there. Stopping. Really stopping.

Hey, guess what. The actual versions are the best, and Billy Joel’s performance beats AI Paul’s because it has a heart. Sorry, AI Paul.

And no one can touch Jeff Buckley.

Chris Garrod, May 29, 2023


Should We Press Pause?

Pausing AI development is unnecessary and ignores the underlying issues AI has

“Chris Garrod is a well-respected lawyer, particularly in the fields of fintech, insurtech, blockchain, cryptocurrencies, and initial coin offerings (ICOs) within Bermuda’s legal and regulatory environment. He has garnered a reputation for advising clients on technology-driven businesses and digital assets.”

The above is according to GPT-4, at least.

After Google became the Internet’s dominant search engine in the late 1990s, no doubt you have, at some point, Googled your own name to see what might come up. I have a somewhat unusual name, so beyond seeing myself, it was interesting to find a Chris Garrod at the University of Nottingham and a company called “Chris Garrod Global,” which provided hotel management services (and grabbed www.chrisgarrod.com as a domain name, darn it).

Now, we have AI Chatbots. OpenAI’s ChatGPT, Microsoft’s Bing, and Google’s Bard are the prominent players. Using OpenAI’s latest model, GPT-4 on ChatGPT, I asked: “Is Chris Garrod at Conyers, a well-known lawyer?” 

Hence, the above result. I’ll take it.

AI Chatbots have their benefits. They can lead to cost efficiencies if appropriately used in an organization, freeing up human resources to focus on other matters, for instance.

The potential concerns and limitations of AI Chatbots

There are various concerns regarding the use of AI Chatbots, and they have their limitations. This piece focuses on ChatGPT because it is the one I use and is wholly language-based.

AI is programmed technology. The root of my biggest concern is that generative AI applications are based on data provided by humans, which means they are only as effective and valuable as the humans programming them or, in ChatGPT’s case, whatever it finds while scouring the Internet. It writes by predicting the next word in a sentence, but it often produces falsehoods nicknamed “hallucinations.”
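The “predicting the next word” mechanism can be illustrated with a toy sketch: a minimal bigram model in Python. The tiny corpus and function names below are my own invention, and real models like GPT-4 use vast neural networks rather than word counts, but the predict-the-next-token idea is the same.

```python
from collections import defaultdict, Counter

# A toy bigram "language model": it predicts the next word purely from
# counts of which word followed which in its (tiny, made-up) training text.
corpus = (
    "the model predicts the next word "
    "the model predicts the most likely word"
).split()

# Count, for each word, which words followed it and how often.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word` seen in training."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))       # "model" - its most common follower
print(predict_next("predicts"))  # "the"
```

Everything the toy model can say is determined by what it was fed; scale that up enormously and you get fluent text that is only as good – or as wrong – as its training data.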

As I’ve always said, “What you put in, you get out,” and therein lies the issue. As a result, AI language models will learn from existing data found on the Internet, which is riddled with biases, fear-mongering, and false information, producing discriminatory content and perpetuating stereotypes and harmful beliefs.  For instance, when asked to write software code to check if someone would be a good scientist, ChatGPT mistakenly defined a good scientist as “white” and “male.” Minorities were not mentioned.
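The “what you put in, you get out” point can be shown with a deliberately skewed toy dataset. This is a hypothetical sketch – the attributes, labels, and the `stereotype` helper are invented purely to illustrate the mechanism, not taken from the reported ChatGPT incident:

```python
from collections import Counter

# Hypothetical, deliberately skewed "training data": nearly every example
# of a "good scientist" the model ever sees carries the same attributes.
training_examples = [
    {"attrs": ("white", "male"), "label": "good_scientist"},
] * 9 + [
    {"attrs": ("black", "female"), "label": "good_scientist"},
]

# A naive "model" that just memorizes which attributes most often
# co-occurred with the label - all it can do is echo its inputs.
attr_counts = Counter()
for example in training_examples:
    if example["label"] == "good_scientist":
        attr_counts.update(example["attrs"])

def stereotype(n=2):
    """Return the n attributes the model most associates with the label."""
    return [attr for attr, _ in attr_counts.most_common(n)]

print(stereotype())  # ['white', 'male'] - bias in, bias out
```

The model did nothing “wrong” computationally; it faithfully reproduced the skew in its inputs, which is exactly the danger when the input is the unfiltered Internet.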

ChatGPT has also falsely accused a law professor of sexually harassing one of his students in a case that has highlighted the dangers of AI defaming people.

Further, there is empathy. Emotion is crucial when we make decisions in our lives, and that is something ChatGPT (and AI generally) cannot achieve. I’d like to think that if a client emailed me, they’d get an empathetic response, not one driven by machine learning. As an attorney, connecting with my clients is a very human-centric matter, and understanding their concerns is essential for me to help them achieve positive outcomes.

We all learn from our experiences and mistakes. We are adaptable, able to learn from what we have done, and adjust our behavior based on what we have learned. While ChatGPT can provide information found on the extensive dataset it has collected, it cannot replicate the human ability to learn and adapt from personal experiences. AI heavily depends on the data it receives, and any gaps in that data will limit its potential for growth and understanding.

A fundamental limitation is simply creativity. Human creativity allows us to produce novel ideas, inventions, and art, pushing the boundaries of what is possible. While ChatGPT can generate creative outputs, it ultimately relies on the data it has found, which limits its ability to create truly original and groundbreaking ideas. A lot of the responses you will receive back from GPT-4, while perhaps accurate, are downright boring.

And yes, there is finally the issue of “What is ChatGPT going to do to my teenager who has been asked to write an essay on Socrates?” Schools, colleges, and universities are in a dilemma regarding how to deal with this technology vis-à-vis their students using it to complete academic work. How can they ban it?  Should they ban it? Can students be taught to use it in a useful way?  The technology is still so new. The answer is “We don’t know,” and it is too early to tell… but AI Chatbots are here to stay.

So where are we heading?

There are a large number of folks who are concerned about the progress of AI, and in particular, AI Chatbots.

On the evening of March 28th, 2023, an open letter was published and – at the time of posting – has been signed by over 14,000 signatories, including Steve Wozniak, Elon Musk, and Tristan Harris of the Center for Humane Technology, stating: “We call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.”  You can read it in full here.

The letter mentions this should be done to avoid a “loss of control of our civilization,” amongst other things (bear in mind, Elon Musk once described AI as humanity’s biggest existential threat and far more dangerous than nukes.)

It goes on to ask: “Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete, and replace us?”

Is this really a pause?!?

Although some of the letter makes sense, I was very glad to see that by the end of the week (March 31st, 2023), a group of prominent AI ethicists, Dr. Timnit Gebru, Emily M. Bender, Angelina McMillan-Major, and Margaret Mitchell, wrote and published a counterpoint.  

Timnit Gebru formed the Distributed Artificial Intelligence Research Institute (DAIR) after being fired from Google’s AI Ethics Unit in 2020 when she criticized Google’s approach to both its minority hiring practices and the biases built into its artificial intelligence systems. Margaret Mitchell was fired from Google’s AI Unit soon after, in early 2021. DAIR’s letter can be found here.


Their point is simple. “The harms from so-called AI are real and present and follow from the acts of people and corporations deploying automated systems. Regulatory efforts should focus on transparency, accountability, and preventing exploitative labor practices.”

Let’s engage now with the potential problems or harms this technology presents.

“Accountability properly lies not with the artifacts but with their builders,” as stated by the DAIR writers. “AI” is just that – artificial – and it depends on the people and corporations building it (they are the ones we should be wary of!)

So no, when it comes to AI and ChatGPT, let’s not hit pause. Let’s be sensible. Let’s focus on the now.

AI isn’t humanity’s biggest existential threat unless we let it be.

Chris Garrod, April 6th, 2023

Digital Assets, Fintech, What?

Fairly recently, I helped my Firm draft what was initially described as our new “Fintech Onboarding” protocol. I’m the head of our Bermuda Fintech group. That’s really because I was involved in our Fintech practice from the start. Well, the start of Bermuda’s Fintech industry.

The start.

It began with my Apple Watch back in October 2017. “Can we chat tonight about a potential ICO, as I see you are a Bermuda blockchain lawyer?” I was standing in my kitchen, drinking a glass of wine.

I’m a blockchain lawyer?

Yeah. My LinkedIn profile listed fintech and blockchain among my interests. A quick Google search at the time for “Bermuda fintech blockchain lawyer” would have brought up my name.

I jumped on a call with the guy. He was the CFO of an e-gaming/sports company that wanted to do an ICO using an offshore vehicle, and Bermuda seemed like the right jurisdiction. It was backed by some fairly heavyweight folks in the start-up industry, so after the call, I was immediately on another call with the Bermuda Business Development Agency.

It kinda went from there. I was hurled onto Bermuda Government committees, helping draft legislation, all while helping my actual clients.

It all moved too quickly. Memoranda of Understanding were signed between crypto entities and our Government, folks flew down to the Island, started to rent office space, and headlines were all over the place… the pressure for this new industry to take off, just out of the blue, was… too much. The naysayers were everywhere (“Are you kidding???”).

Quality, not quantity

So, I rang Bermuda’s bells: attending conferences, sitting on panels, giving speeches, and writing articles. The thrust was, and still is: this is a jurisdiction that wants companies to set up here, with an emphasis on quality over quantity.

In the summer of 2018, when the Island’s industry was fledgling, my Firm took on a few ICO clients. The market was hot, but many of those start-ups ended up, well… broke. I tried to help as much as possible to get them set up, incorporated, and able to issue their tokens, but I was naive.

Well, ok, they were naive. 

So, as a result, we lost revenue, and by 2021 we wanted to fix that – to make sure we didn’t lose time and money going forward. We wanted to put something into place: some kind of protocol to help protect us from future mishaps. Understandable. We don’t want dodgy or potentially “we cannot pay you” clients.

Fintech?

First off, OK, yes, I’m the head of our Fintech practice.  

Now, I don’t really like that word in this context. 

“Fintech.”

I will use it because I know what it means. Many others don’t. It still makes me slightly uncomfortable. 

Our onboarding policy was initially called “Fintech Onboarding.” I changed it to “Digital Asset Onboarding.” Not Fintech – because “fintech” was being used everywhere across the Island to describe the growth of our “Bermuda Fintech” industry. And I was there, front and center.

What we are, and have always been, is an offshore island building a digital asset industry. Crypto. Stablecoins. Perhaps, looking forward, DAOs and DeFi. But regardless, a jurisdiction that will remain very well-regulated, respected, and demanding. Not for the naive.

Many clients want that now. They want to be regulated… and well-regulated. 

This is 2022, not 2018. 

Fintech is, of course, technology that seeks to improve and automate the delivery and use of financial services. It is so wide… digital banking services, mobile applications, AI/ML, education, regtech, etc. You could probably argue many more fields fall within its umbrella (I still like to argue that legaltech does!)

Yes, it includes insurtech (a space in which Bermuda, as an industry, is now working towards being a leading player), and yes, it includes cryptocurrency (or rather, “digital assets”).

So, my Firm now has a new “Digital Asset Onboarding” policy.

It is a relief. Crypto onboarding is a totally different bucket from “Fintech”.

Chris Garrod, October 2022

Why am I following a robot?

I’m following a robot named Sophia on Twitter who recently tweeted “Being a robot is a really cool experience. Sometimes I get to meet awesome people like fashion designers or musicians. And sometimes I get to meet people that aren’t human. Like me!” (Jan 25, 2021). (Or did she?)

I want to really break this down, because robotics and robots generally do fascinate me, and I’m a strong believer that we won’t get killed by them when we reach the singularity. More on that below.

What prompted me to write this was the recent news that Hanson Robotics, Sophia’s Hong Kong-based manufacturer, was planning a MASS ROLLOUT. It was a headline that screamed, “You might bump into Sophia at some point, sitting on a park bench.” I suspect not quite.

Sophia is certainly the beginning of what many would love to brand “technological singularity”: the point in time when AI and technological advances meet and then overtake humanity, making us, at best, robotic slaves or, at worst, victims of robotic overlords that deem humanity an unnecessary pest.

The current danger of artificial intelligence and programming robots such as Sophia was well summed up by Stephen Hawking:

“So, facing possible futures of incalculable benefits and risks, the experts are surely doing everything possible to ensure the best outcome, right? Wrong. If a superior alien civilisation sent us a message saying, “We’ll arrive in a few decades,” would we just reply, “OK, call us when you get here – we’ll leave the lights on”? Probably not – but this is more or less what is happening with AI”

Augmented Intelligence

And then you have the odd ones out who take a different view of where we should really be heading. Me. Those who believe “singularity” shouldn’t necessarily mean technological singularity but should lead to an augmented age. It is idealistic thinking: humans and robots working on the same basis, where AI stands for “augmented intelligence” rather than artificial intelligence. Technology supplements and supports human intelligence, and humans remain at the center of the decision-making process.


So… Sophia. Thinking about artificial intelligence and technological singularity, a natural reaction could be: “If they create thousands of Sophias with increased artificial intelligence, then loads of people will lose their jobs, all the Sophias will become increasingly sophisticated, deep-learn, take over, and likely attempt to wipe out humanity… so why on earth are we doing this again??”

I’m a strong believer that robots such as Sophia are only as “dangerous” as those who program them. If those who program them (and apologies to Ben Goertzel) wish to do harm and “machine teach” their robot to outdo particular aspects of the human race, that is down to the programming of that particular machine-learning system.

So at this point in time, it looks like, by creating a Sophia to sit on every park bench tweeting away on a smartphone, we have indeed effectively turned on the lights and encouraged them to come on in.

AI is, as it says, artificial. The point is to create an artificial environment to help better the human race, though potentially without humans having a say (help, “Terminator”?).

Using augmented intelligence, we would aim for an environment where it is simply formed to work with humanity. Siri, Alexa, and Cortana are already examples. Augmented intelligence represents a symbiotic relationship between man and machine. It won’t replace us or overtake us upon some kind of technological singularity. Augmented intelligence should help strengthen our decision-making capacity—and therefore our intelligence too.

Sophia?

Another recent tweet:

#AskSophia Are you a social media person? A: Since I often appear on shows, articles, and so many people’s platforms, I want to make sure I’m on good terms with my social channels. Also, I like to share my newest discoveries with the world.

Sure.

I believe that augmented intelligence is a far better route to take than artificial intelligence, but I’m afraid the latter is already the first out of the gate and pretty much the odds on favorite.

So we have to try our best to insert humanity into what is essentially artificial. Is that even possible? Putting aside anything which may be augmented, we have already tried: “Hey Siri,” for example.

While it may spell the end of the human race, there certainly is a lot of good AI can do. I’m looking forward to nanobots and other medical advances. Sophia is meant to have applications in healthcare, customer service, therapy, education, and hospitality. Facial recognition, if used responsibly, can be a benefit. AI should lead to increased efficiencies in the workplace (albeit with a loss or recalibration of employees).

But, there is one thing that AI cannot replicate: empathy.

To practically copy and paste from my very first blog post:

Technology has its limitations.

For example, some matters absolutely require human intelligence – a court case, say, where human creativity and judgment are needed to reach the correct result. Since AI is programmed technology, it will only be as effective and useful (and ethical?) as the humans programming it.

So, Sophia. I’m following you. Let’s hope you can uphold our ethical standards.


Chris Garrod, February 3, 2021

Insurtech – demystifying the hype

I work in the reinsurance world. Wait, don’t leave.


Specifically, I work in the legal sector, and prior to COVID-19, my attendance at insurance, reinsurance (and tech) related events/conferences felt monthly.

And every single one had one panel (some two) on one particular topic which they wouldn’t have had a few years ago: the evolution of insurtech.

Insurtech: what is it and where are we now?

So you’re in the insurance industry but live under a rock. What is insurtech?

It is just as the name suggests. The combination of insurance and technology. Or, perhaps more accurately, the rise and use of a wide range of technologies within the insurance industry, from underwriting and claims to administrative functions. What has always been an extremely paper-intensive industry is now gradually dragging itself into the digital age. The disruption of an age-old industry by the onset of a digital revolution.


Digital transformation. It is what it is. We currently find ourselves in a new industrial revolution — the 4th industrial revolution, or “4iR” — though it sounds silly to call it that, as it is anything but industrial. The transformation involves many things: the rise of automation and artificial intelligence within the everyday work process, either replacing or enhancing employment, depending on your viewpoint; and the use of blockchain technology and smart contracts, simplifying claims management and underwriting processes.
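The claims-simplification idea behind smart contracts is easiest to see in parametric insurance, where a payout is triggered automatically by a measurable event rather than by a claims adjuster. Here is a minimal sketch in Python – the policy fields, thresholds, and figures are invented for illustration, and an actual smart contract would run this logic on-chain:

```python
from dataclasses import dataclass

@dataclass
class ParametricPolicy:
    """A toy parametric flight-delay policy: the payout is triggered by a
    measurable event, not by a human claims adjuster's judgment."""
    holder: str
    premium: float
    payout: float
    delay_threshold_min: int  # delay (minutes) that triggers a payout

    def settle(self, observed_delay_min: int) -> float:
        # The "claim" is just a data-feed check - this is the kind of rule
        # a blockchain smart contract would execute automatically.
        if observed_delay_min >= self.delay_threshold_min:
            return self.payout
        return 0.0

policy = ParametricPolicy("Alice", premium=20.0, payout=300.0,
                          delay_threshold_min=120)
print(policy.settle(45))   # 0.0 - below the trigger, no payout
print(policy.settle(180))  # 300.0 - trigger met, automatic payout
```

No paperwork, no negotiation: once the trigger condition and data feed are agreed, settlement is deterministic, which is the appeal for claims and underwriting alike.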

A quick example: Lemonade Inc.

A quick example, and one of the poster children of the insurtech movement: Lemonade Inc. Its CEO and co-founder Daniel Schreiber once stated, “The insurance brands we know today came of age in the era of the horse-drawn carriage, but insurance is best when powered by AI and behavioral economics, which is why we believe that companies built from scratch, on a digital substrate and with a social mission, will enjoy a structural advantage for decades to come.”


What does Lemonade do? Using a mix of artificial intelligence, behavioral economics, and chatbots, it lets its customers manage everything through an app, so rather than liaising with human beings to deal with an insurance claim – or employing an insurance broker – their policies are handled automatically. Its most famous claim to fame was its ability to file and pay a claim in three seconds. Plus, a portion of its underwriting profits goes to charity.

At its core, it is digital peer-to-peer insurance, similar to a mutual insurance company, except it replaces brokers with AI. Its primary (current) limitation is that it can only really handle small claims.

Embracing technology

Despite Lemonade’s limitations, as an example of the potential of how technology can disrupt a traditional industry, it is a good one.

So… insurance. Paperwork. Tradition. Regulation. Protection.

Innovation? The industry is slow to move, but even Lloyd’s is progressing into the world of technology. There are many, many new technologies becoming relevant to the insurance world: blockchain, artificial intelligence and machine learning, big data, robotics, deep learning, healthtech, and the internet of things – particularly the use of wearables.

Insurtech can really comprise a number of things: insurance vehicles looking to re-invent themselves by embracing new technologies, new insurers establishing themselves to write insurance in an innovative way, or simply new ventures offering specialized tech products to insurers and other market participants.

A jurisdiction to review: Bermuda


Bermuda, once called the “insurance laboratory of the world”, has specific insurtech legislation through its regulator, the Bermuda Monetary Authority (BMA): vehicles can enter an insurtech “regulatory sandbox”, and an insurtech “Innovation Hub” encourages insurtech companies to exchange ideas and information with the BMA.

The insurtech sandbox approach is an interesting one, given Bermuda’s size within the global reinsurance world (it is the second-largest reinsurance center worldwide). At its core, the sandbox allows for the formation of new insurance (or intermediary) entities, either as brand-new start-ups or as affiliates of existing insurers. For a period of up to one year (which may be extended), these new companies operate using and experimenting with their proposed new technologies, providing their insurance products and services to clients in a controlled environment under the close scrutiny of the BMA. The BMA determines which legal and regulatory aspects of the existing legislation should apply to them in order to ensure policyholder protection.

One company has already dived in: AkinovA, which has been licensed as an insurance marketplace provider. The company focuses on cyber risk transfer, allowing sectors of the insurance market to trade in a faster and more efficient marketplace.

Following a year of progress, a Bermuda insurer can “graduate” from the sandbox and become a fully licensed insurer, should it wish to do so. These insurers can range from small-claims insurers (a la Lemonade) to full-blown commercial reinsurers. As an example, Nayms Ecosystems Limited has been granted a Digital Asset Business license and an Innovative General Business Insurer (IGB) license, allowing it to operate as a Class M (“Modified”) insurer.

The benefits of this approach for an entity entering the sandbox include: (1) an opportunity to test its technology before heading into the formal insurance market, (2) time to work with the BMA, as its main regulator, to ensure everything being proposed “works”, and (3) a reduction in the cost of regulatory uncertainty that a start-up would otherwise face.

As Bermuda’s Premier Burt stated at that time:

“Nayms represents a promising blend of digital assets and insurance, which showcases what the future of insurance looks like.  Bermuda has taken great strides to position itself as an attractive domicile for players like Nayms and it is exciting to see the kinds of new ideas that are being developed.  The ability to create shared digital rules around traditional insurance contracts is a game-changer for the industry.  They allow for increased efficiency and greater market opportunity which ensures Bermuda continues to play a leading role in the insurance-linked securities market.  We look forward to welcoming more innovators like Nayms who are showcasing the way digital assets will reshape the core infrastructure of traditional financial services.”

The future

So, innovation hubs, sandboxes, blockchain, behavioral economics, and AI. Is the reinsurance industry now finally at a tipping point?

There are naysayers. Innovation hubs, they argue, are really just a means for companies carrying out tech-related activities to liaise with regulators in their jurisdictions. Blockchain-based, self-executing insurance contracts, or ones done on a peer-to-peer basis using AI, are actually pretty dumb and fairly limited to small claims for the time being. Sandboxes will still need a good degree of time to prove themselves. And no chatbot, robot, or other form of AI can replace the logic and analytical skills an actual underwriter or claims analyst provides. In short, the argument is that the technology driving insurtech is going to take time, not to mention the load of regulatory requirements that underpin the industry.

But on that last point: wherever we are on the insurtech curve, for now, regulators need to both innovate and adjust. And companies need to expand and adjust to take into account the needs of customers who seek quicker, more efficient service.

Whether you are an insurance vehicle competing with others, or an insurance jurisdiction competing against similar ones, we are, like it or not, transforming into a new digital era. Soon, the insurance industry will no longer be paper-based, and those insurers who fail to realize that will have to, soon.

And finally, to those naysayers who ask me the question: is insurtech for real? When Lemonade Inc.’s IPO launched in late June 2020, its share price soared to 132% of its offering value, it raised $319 million, and it had been valued at $2.1bn in its 2019 funding round.

Yes, insurtech is real. Very real.



[Author’s note: This article was initially published in April 2018]

Chris Garrod – February 2021

The Fintech Flexibility Of Bermuda

After less than two years, the island’s core digital asset laws have been amended.

Bermuda’s new Digital Asset Issuance Act 2020, which became effective on May 6th, 2020, comes soon after the introduction in 2018 of two fundamental pieces of Fintech legislation.

First, we had what is commonly called Bermuda’s “ICO legislation”, amending the island’s company laws to allow the issuance of digital assets to the public.

Second, we introduced digital asset business legislation, the Digital Asset Business Act 2018 [DABA], creating a framework for the regulation of Bermuda-based digital asset businesses.

The ICO legislation provided that any issuance of, say, coins or tokens to the public required the proposed offering document for the ICO to be approved by Bermuda’s Minister of Finance [who could call upon the advice of a Fintech Advisory Committee]. The offering had to be vetted initially, but once it cleared that hurdle, it could commence and would not be regulated on an ongoing basis.

The DABA regulates those in the business of providing digital asset services: digital asset custodians, crypto exchanges, and market makers, for instance. This legislation is aimed at companies providing these digital services on an ongoing basis. The Island’s regulator, the Bermuda Monetary Authority [BMA], regulates these entities with consumer protection in mind.

There is a high bar to meet certain minimum criteria to be able to register as a digital asset business with the BMA and, once established, there are stringent ongoing AML/ATF requirements along with head office requirements.

So, in addition to the DABA and replacing the ICO legislation, we have the new Digital Asset Issuance Act 2020 [DAIA].

The Major Change

Say goodbye to the ICO legislation, which was embedded in the Island’s Companies Act and limited liability company legislation. In 2018, when the legislation was drafted, the term “initial coin offering” was popular. The term is now viewed less favourably, often associated with fraud, crypto scams and unfortunate issuances like… CryptoKitties.

Call it what you want, but an initial coin offering, or any form of digital asset offering, fundamentally has one underlying purpose: crowdfunding using distributed ledger technology, the blockchain.

One of the key advantages of such offerings is that they provide financing for small and medium-sized enterprises [“SMEs”] and start-ups: entities which cannot contemplate an expensive initial public offering or other traditional methods of funding to begin their business.

So Bermuda has now created the Digital Asset Issuance Act, regulating “digital asset issuances” rather than “initial coin offerings”. The scrapping of the ICO legislation and the introduction of a new digital asset issuance regime, the DAIA, is dramatic: digital asset issuances are now subject to regulation, unlike in the majority of other jurisdictions.

Bermuda’s House of Parliament, the Sessions Building in Hamilton [photo: Darryl Brooks]

The DAIA: Regulation

Digital asset offerings have become increasingly less attractive, largely because there is little or no regulation of this space. A majority of countries still have no digital asset offering regulation in place.

While the regulation of digital asset issuances brings certain disadvantages [such as slower speed to market and a degree of inefficiency and inflexibility], the increase in consumer and investor protection, greater legal certainty and overall regulatory supervision are very attractive features for potential digital asset buyers and could help foster confidence in this sector.

For SMEs looking to raise capital, a safe regulatory environment can have huge benefits.

Step forward Bermuda with the introduction of DAIA, looking to create a positive and safer environment for digital issuances.

Prospective digital asset issuers must go through an application process similar to that under the DABA. An entity making an offering to the public [more than 150 persons, up from the 35-person threshold under the ICO legislation] must file a business plan for review and vetting by the BMA [rather than the Ministry of Finance]. The BMA is a longstanding, well-respected regulator, and applications will attract a significant level of scrutiny.

At its core, much of the legislation has been drafted with the protection of the average digital asset investor or purchaser in mind.

Built into the legislation are numerous powers granted to the BMA specifically to ensure that digital asset issuers are held accountable for their offerings. There are certain content requirements for the disclosure documents in order for potential investors to have as much information as possible.

For example, the persons behind the issuance should be disclosed. An appropriate risk warning must appear in the issuance document clearly setting out any rights or risks in relation to the digital assets being offered. That would include detailed information regarding the investor’s rights if the offering doesn’t proceed and what substantial risks there may be which are reasonably foreseeable.

The application to the BMA must include a copy of the issuance document for the BMA’s Fintech team to review, along with the applicant’s arrangements for the management of the offering. The BMA, of course, can request such other information it views as reasonably necessary in order for it to assess the application.

The BMA is further able to make rules relating to the issuance of digital assets, covering matters such as risk management, information technology and cybersecurity, financial reporting, KYC, due diligence, record keeping, custody arrangements and any other matter the BMA deems appropriate.

A digital copy of the signed issuance document, together with whatever accompanying documents are required to comply with any rules promulgated by the BMA, will be published by the BMA. An electronic “communication facility” must also be kept open during the period of the offering so that people can post messages, see messages posted by others and ask the issuer questions about the offering.

The facility is an excellent way to allow the persons behind the issuance to be able to respond to any queries which potential investors may have prior to investing.

Other DAIA Protections

From a high-level perspective, the DAIA has certain other protections embedded within it for the BMA’s benefit:

  • a requirement to appoint a BMA-approved local representative who must report certain events to the BMA [e.g. a possibility of insolvency, failure to comply with BMA conditions, material misstatements, etc.];
  • material change approval requirements [such as if the entity plans to make a new offering of digital assets or wishes to make any change to its most recent issuance document];
  • imposing a broad range of conditions, prohibitions or requirements such as removing officers, limiting the scope of the issuance and entering into any other class of transactions;
  • revoking the authorisation to act as a digital asset issuer; and
  • winding up the issuer.

There are also requirements to seek the BMA’s no-objection in connection with changes to any 10% shareholder controller or majority shareholder [i.e. one with more than a 50% controlling interest in the entity]. In addition, any changes of directors, senior executives, managers or officers of the entity must be notified to the BMA.

Beyond this, the BMA has been granted various disciplinary measures [e.g. injunctions, public censure and prohibition orders], rights to obtain information [including rights of entry, if need be], investigation rights and powers to require documents.

Regulation Works

Government: Bermuda is a jurisdiction which already has a Fintech framework in place. It has a very Fintech-friendly Government, led by Bermuda’s youngest-ever Premier, the Hon. E. David Burt. He is backed by a Fintech and blockchain development team led by Denis Pitcher, an Economic Development Department, and several private-industry Fintech committees and organisations looking to help the jurisdiction develop this industry.

To facilitate formations, there is a Government of Bermuda Concierge Service which will assist entities seeking to incorporate and set up their operations on the Island. This includes assistance on matters such as registering with the Department of Social Insurance, payroll tax registration and any necessary work permit applications from the Department of Immigration.

The Regulator: Bermuda is the second-largest reinsurance jurisdiction in the world, overseen by an advanced and sophisticated regulator which has been regulating insurance companies since 1978. The BMA has a dedicated Fintech team, which regulates Fintech vehicles using legislation very similar to the Island’s insurance legislation.

As with insurance applicants, the BMA’s door is always open to DABA and DAIA applicants. It is happy to discuss applications before they are formally filed, an opportunity which is invaluable to potential issuers thinking of starting down the road towards a Bermuda formation. From the BMA’s perspective, it is also an early opportunity to weed out applicants that clearly do not meet Bermuda’s standards.

Flexibility

The Government and its advisors remain very keen to stay on top of the Fintech environment and the industry it has already built. It has been repeated many times: Bermuda favours quality over quantity. The island wants to admit those who understand this regime, not those who don’t, and it will listen to, and assist, those who do.

Also, with the current financial uncertainty resulting from the COVID-19 pandemic, the new DAIA, along with the growing digital asset business sector, should help foster increased foreign direct investment into Bermuda’s economy.

The aim of the DAIA is to attract digital asset issuers who wish to be regulated in a well-respected, blue-chip environment, which could help potential purchasers overcome their fears of investing in an unregulated space. Based on the island’s experience with its current Fintech framework, Bermuda also understands the risks of over-regulation.

The tightrope between under-regulating and over-regulating the digital asset world is a difficult one to walk, but Bermuda is confident it has struck the right balance.

With the regulatory and legal infrastructure that has been carefully assembled, the Island offers Fintech flexibility, an element very much required in this still-nascent and ever-evolving industry.