

The most innovative Yle projects in 2020-21: edge computing, opportunities in artificial intelligence, neural networks, synthetic sound, organic beaver...

Updated 28.01.2022 14:53.
A diagram showing how a 4K live could work.
Image: K4-tiimi / Yle

The first ever Yle Innovation Awards were presented on 7 December 2021. The award for innovation project of the year was given to Audio versions of news articles with speech synthesis by Ville Kinnunen, Noora Santavuori, Tapio Kaartinen and Mikael Hindsberg.

The award for the most educational innovation project of the year was given to Areena 4K Live Online broadcast.

The award for Innovator of the Year was given to Jyri Kivimäki / Sports and Events, and the award for the most active user on Yle's ideation platform Viima went to Jarkko Ryynänen / News Lab.

A description of the ideation process by Yle Innovations.
Image: Satu Keto / Yle

But what is the Yle Innovation Award and where were these pilot projects born?

There’s plenty of enthusiasm for experimentation at Yle and a will to develop content and services. Sometimes, “real work” eats away at the time to brainstorm, and developing what already exists pushes the work on visions to the side. What’s mandatory overrides the important.

Aware of this, the Yle Innovation and Experimentation Network set out to solve the problem. In spring 2021, Yle Innovations launched the Viima platform to foster a culture of experimentation and make it easier to join development projects within Yle. It's easy to post a rough idea on Viima and then develop it further together, refining the concept either on your own or with the help of Yle Innovations. After that, it's easy to pitch the idea to the community of peer developers in Yle Sandbox, quickly find encouragement as well as the financial resources needed for development or for hiring a substitute, and then carry out the experiment quickly, free of pressure.

By December 2021, Viima had gained 402 users, and since April it had gathered 50 new ideas, hundreds of thumbs-up and comments, and dozens of follow-up discussions.

Seven innovation categories by Yle Innovations.
Image: Satu Keto / Yle

The following is a list of innovation projects that were completed during 2020-2021, from which we chose the Innovation Project of the Year.

Areena 4K Live Online broadcast

Partanen, Svärd, Pöri, Kivimäki, Kivinen, Ruohomäki, Saarelma, Ekman, Ljungberg

We tried out the ways that Yle could provide 4K live broadcasts on Areena with the lowest possible effort. The trial was carried out around the European Championships in football, and its scope was limited to football fans with UHD (HEVC) smart TVs running the Areena application. There is no service on the market that implements the entire chain for 4K live broadcasts, which is why this trial required close cooperation within Yle. We wanted to trial a 4K live broadcast on devices that could make the most of it (smart TVs).

What valuable lessons did we learn? The trial was very valuable, but it did not quite get to where we wanted. We were unable to achieve a realistic enough test signal, as the delay on UHD broadcasts must be close to the level of current broadcasts. There were also challenges with terminal devices, with signal buffering and with colours, and we lacked technical analytics. In addition, public interest was lower than expected; we learned that a market survey would have been needed, along with more communication to raise awareness of the experiment among potential audiences. We also understood that choosing a good external partner is important, as is a good division of labour internally.

A cartoon image of an imaginary neural network and an explanatory speech bubble.
Image: Stiina Tuominen/ Yle

Audio deep fake - let's try and show how it’s done

UA Plusdesk: Tolvanen, Tuominen, Arponen, Lampén, Hallamaa, Lehtivuori & Paldanius, Lindfors, Mattila including the voices of Jenni Poikelus and Hesa-Äijä

The team experimented with implementing a neural network-based audio deepfake, from idea to the customer interface. The process was explained to the audience in an article and demonstrated to listeners in a podcast. At the same time, we told the audience about deepfakes as a phenomenon, including their risks and opportunities. The team produced interesting content for young audiences using new technology and familiar voices. The trial itself, as an application of technology to content needs, was interesting. The article and podcast on the trial reached many young audiences, and they also attracted people on Instagram. Both making artificial intelligence accessible and informing Finns about the opportunities it offers are valuable.

A chart of responses.
Image: Teemu Lukkari / Yle

Audio versions of news articles with speech synthesis

Ville Kinnunen, Noora Santavuori, Tapio Kaartinen, Mikael Hindsberg

In this trial, we created audio versions of our news articles with text-to-speech synthesis. Our primary purpose was to find out when and where audiences would prefer an audio version over text. This trial had a strong focus on the future, and it is possible even today to provide 24/7 listenable news on Yle services, as text-to-speech is relatively easy to roll out in production. The easiest approach would be to start with a limited set of content, as in the BBC benchmark, possibly selected according to the needs of the Live project.

The project was successful in creating valuable knowledge on how to automatically create high-quality audio publications from news articles. We also found that our audiences are interested in audio versions of our articles, and that automatic text-to-speech quality is sufficient for Yle's needs. The trial also supports Yle's accessibility practices.
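Yle's production pipeline is not described in detail here, but one common building block of such a system is splitting an article into sentence-sized chunks before handing them to a speech synthesiser. A minimal sketch, assuming a hypothetical per-request character limit on the TTS engine:

```python
import re

def split_for_tts(article: str, max_chars: int = 200) -> list[str]:
    """Split article text into chunks a TTS engine can synthesise
    one at a time. max_chars is an illustrative engine limit."""
    sentences = re.split(r"(?<=[.!?])\s+", article.strip())
    chunks, current = [], ""
    for s in sentences:
        # Start a new chunk when adding the sentence would overflow.
        if current and len(current) + len(s) + 1 > max_chars:
            chunks.append(current)
            current = s
        else:
            current = f"{current} {s}".strip()
    if current:
        chunks.append(current)
    return chunks

article = ("Yle trialled audio versions of news articles. "
           "Text-to-speech made the articles listenable. "
           "Audiences showed clear interest.")
for chunk in split_for_tts(article, max_chars=90):
    print(chunk)
```

Each chunk can then be synthesised independently and the audio segments concatenated, which also makes it easy to resynthesise only the paragraphs that change when an article is updated.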

A picture of a virtual studio.
Image: Markus Nygård/ Yle

Aximmetry - AR/virtual studio on the Unreal Engine

Joonas Lintunen, Markus Nygård, Vilma Juhava, Jaakko Palvaila, Otto Rönkä

This project was a conceptual experiment into the creation of a cost-effective virtual studio that would be easy to use and that would enable HTC Vive tracking as well as the use of PTZ cameras. We carried out a successful experiment, which resulted in an impressive virtual studio at relatively low cost. The trial combines existing technologies into a highly cost-effective way of creating a usable virtual studio. At the same time, we gained the ability to create a virtual studio with an investment of only a few thousand euros by using existing facilities. Aximmetry was used in the autumn 2021 eSports productions.

An ultra-running event: a man running while another man streams with a mobile phone.
Image: Sacha Lagrillière / Yle

Nuuksio Backyard Ultra

Sacha Lagrillière, Jyri Kivimäki

The aim of this experiment was to broadcast the Nuuksio Backyard Ultra cross-country running event in a way that combined an ultralight production method with a high degree of audience interaction. In the end, the continuous 42-hour broadcast was carried out with nothing but iPhone smartphones, five of which were placed at different parts of the 6.7 km forest path. The video signals, transferred over the mobile network, were controlled with the free-to-use OBS Studio live streaming software and start-up company Kiswe's cloud-based production environment. The event-based nature of the broadcast was underlined by an additional GPS-based online map, which made it possible to track the top competitors in real time even outside the image feeds. Our ultralight method made it possible for just two people to run the entire broadcast.

The experiment was quite innovative: Backyard Ultra differs from a typical sporting event, and the production model for broadcasting it was particularly experimental.
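The GPS map implementation is not described in the article, but any such tracker needs to turn raw GPS fixes into distances. A minimal sketch using the standard haversine formula; the coordinates and function names are illustrative, not Yle's actual code:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6_371_000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def distance_covered_m(track):
    """Sum leg distances over a list of (lat, lon) fixes."""
    return sum(haversine_m(*a, *b) for a, b in zip(track, track[1:]))

# A few fixes along a forest path (coordinates illustrative only).
track = [(60.3100, 24.5100), (60.3105, 24.5110), (60.3112, 24.5122)]
print(f"{distance_covered_m(track):.0f} m")
```

Accumulating these leg distances per runner is enough to place the leaders on a map of the 6.7 km loop between GPS updates.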


AI detection of a beaver in a nature livestream
Timo Heikkinen, Joonas Lintunen, Tapio Kantele

In this trial, an AI was taught to identify a target, namely a beaver, in a long-running nature stream. The trial tested how much help an AI could be in editorial work: creating best-of clips from long nature-focused livestreams is laborious due to the large volume of footage. The results showed that effective detection helps with finding the useful sections. In the trial, beaver identification was not spotless, and the filming conditions were challenging due to the darkness, among other things. It is an innovative way to apply technology that was intended for something else to the needs of nature content.
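The article does not show how detections become clip candidates, but a typical post-processing step is turning per-frame detector scores into contiguous time segments. A minimal sketch, where the threshold and minimum clip length are illustrative tuning parameters, not values from the trial:

```python
def detection_segments(scores, threshold=0.5, fps=25, min_len_s=2.0):
    """Turn per-frame detector scores into (start_s, end_s) clip
    candidates: contiguous runs at or above the threshold, dropping
    runs shorter than min_len_s seconds."""
    segments, start = [], None
    for i, score in enumerate(scores + [0.0]):  # sentinel closes the last run
        if score >= threshold and start is None:
            start = i
        elif score < threshold and start is not None:
            if (i - start) / fps >= min_len_s:
                segments.append((start / fps, i / fps))
            start = None
    return segments

# Toy 1 fps stream: the "beaver" is visible in frames 3-7.
scores = [0.1, 0.2, 0.1, 0.9, 0.8, 0.9, 0.7, 0.6, 0.1, 0.0]
print(detection_segments(scores, threshold=0.5, fps=1, min_len_s=2.0))
```

An editor then only needs to review these candidate windows instead of scrubbing through hours of mostly empty footage.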

A picture presenting edge computing.
Image: Immonen Antti/ Yle

Edge computing in cloud-based TV production

Antti Immonen and team

We trialled an edge computing service with Bright and Also which virtualises TV production processes in a cloud workstation: sound and image mixing as well as live graphics, using the desktop software vMix. At the same time, we made remote production over IP possible with the NDI Video Over IP protocol. The server operated on the so-called edge computing principle and was equipped with the required virtual CPU and GPU power to make real-time TV production possible.

What valuable lessons did we learn? When operating in the cloud, data security management is essential. Finding the right technology partner is vitally important, as is finding a suitable content partner inside Yle to make it easier to explore how the technology bends to real production situations. It is also important to assess the complexity of the experiment beforehand and to allocate sufficient time accordingly.

Quick clipping and publishing of live broadcasts

Eeva Partanen, Jussi Pöri, Jyri Kivimäki

Together with our partner Wildmoka, we trialled a modern tool for creating clips of livestreams for Yle's primary services as quickly as possible.

What valuable lessons did we learn? Sometimes, experiments go a long way back - a familiar technology from years ago proved to work - what we thought would be possible was, indeed, possible. This was a useful experiment.

Virtual presenter Sean Ricks attending the Match XR event.
Image: Satu Keto / Yle

Synthetic presenter for the XR Factory event

Olli-Pekka Salli

In this experiment, we implemented a virtual presenter for the XR Factory 2020 event, which focused on industrial and media XR and which was itself held virtually due to Covid. Yle's virtual spaces, with their presenter, were integrated into the event as a whole. Presenter Sean Ricks, known from the Yle Friday programme, visited the exhibition on several occasions in avatar form, carrying out interviews for the livestream audience just as he would have in the physical world. For this purpose, we created a personal avatar for Sean Ricks. The entire event was hosted at Yle's Mediapolis virtual studio. (Virtual Ricks makes a short appearance at the end of this video.) This fully virtual expo left its mark on the makers, partners and event guests. There are already many new plans for building on the pilot: a virtual preview of scenography (in development), virtual visits to Yle studios (in testing) and the virtual media education game Uutishuone (completed in December 2021).

Synthetic version of Jari Lahti
Image: Satu Keto / Yle

Synthetic Yari

Yle Innovations: Wesa Aapro, Jouni Frilander, Satu Keto, Jari Lahti

This trial set out to produce a synthetic version of Jari Lahti for the Match XR 2020 event in AltspaceVR, to hold a presentation on behalf of the real Jari. We wanted to see whether such a virtual bluff would work and how hard it would be to pull off. As an additional question, we looked into whether the audience could tell that Yari isn't human. The trial revealed that creating a synthetic Yle person is surprisingly easy. In the future, the technology we tested will enable synthetic presenters, journalists, assistants, new "customer interfaces" for Yle services and so on. With this trial, we learned to create a virtual agent for Yle Innovations as well as the virtual AltspaceVR environment. It led to a follow-up trial in which we combined the character, created with Synthesia, with voice modulation/cloning, which also enables future marketing experiments where synthetic versions of personal brands can be used to create promotional material in place of a "real" Yle person.

AI to make people unnecessary in the European Football Championships

Eeva Partanen, Jussi Pöri, Jyri Kivimäki

The aim of this trial was to see whether an AI could find the goals in football matches and publish them on Areena for viewers without the need for human labour, hands-off and quickly. The outcome: it is possible, and it comes with the added bonus of an automatically compiled match review. We learned to produce and publish clips of a livestream quickly and fully automatically. Our old practices and workflows were taken to a completely new level with AI and automation, while adding customer value: clips are available for viewing sooner than before.
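The clipping logic itself is not detailed in the article. One plausible final step, once the AI has produced goal timestamps, is building padded clip windows around them and merging overlaps. A minimal sketch; the padding values and function name are illustrative assumptions:

```python
def goal_clips(goal_times_s, pre_s=20, post_s=40, duration_s=None):
    """Build (start, end) clip windows in seconds around detected
    goal times, clamping to the broadcast and merging overlaps."""
    clips = []
    for t in sorted(goal_times_s):
        start = max(0, t - pre_s)
        end = t + post_s if duration_s is None else min(duration_s, t + post_s)
        if clips and start <= clips[-1][1]:
            # Overlaps the previous window: extend it instead.
            clips[-1] = (clips[-1][0], max(clips[-1][1], end))
        else:
            clips.append((start, end))
    return clips

# Two goals 30 s apart merge into one clip; a later goal stays separate.
print(goal_clips([100, 130, 2500], duration_s=5400))
```

Concatenating all the windows in order would also yield the kind of automatically compiled match review mentioned above.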

Mobile phone with virtual stand.
Image: Satu Keto / Yle

Yle and virtual theatres

Sacha Lagrillière, Jyri Kivimäki

In this trial, we piloted a function that would make it possible for a TV audience to gather in a kind of virtual theatre, where selected content can be watched in a group of 2-6 people. In an integrated form, the service would make it possible to watch everything published on our streaming service (Areena) in such a virtual theatre, but the pilot only applied to selected live TV broadcasts from spring and summer 2021: the Eurovision semi-finals and final, the European Football Championships and the Tokyo Olympics. Yle viewers were given the opportunity to create virtual theatres while the said broadcasts were live. The pilot was implemented by guiding participants to a website maintained by Sceenic which was "masked" as a Yle service.

What valuable lessons did we learn? This could be an additional service for Yle's main offering. Participants said they liked it, but the user experience needs to be "ridiculously easy". Data also showed that viewing times were longer in larger groups compared to "inclusive" groups, i.e. inviting people into the rooms is important, as is marketing the service. In the future, we could also make use of other, third-party platforms to create a similar viewing experience.

3D scanning to support production planning

Petri Karlsson, Pauliina Leino, Juha Harju, Rami Lindholm, David Reilly

This trial looked into the usability of the point cloud created by 3D scanning in the work of various design departments, such as graphic designers, scenographers, lighting and cinematography technicians, and directors. The aim was also to improve and harmonise planning work between these groups and to initiate cooperation as early as the planning stage. The trial created effective ways to model spaces for production planning, and it has already led to further development in production-planning services and future virtual-reality productions.
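Raw scans are usually far too dense to share between design departments, so a standard first step is voxel-grid downsampling: averaging all points that fall into the same cubic cell. A minimal sketch of the idea (the voxel size and sample points are illustrative, not from the trial):

```python
from collections import defaultdict

def voxel_downsample(points, voxel_m=0.1):
    """Reduce an (x, y, z) point cloud by replacing all points in
    the same voxel_m-sized cubic cell with their centroid."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_m), int(y // voxel_m), int(z // voxel_m))
        buckets[key].append((x, y, z))
    # One averaged point per occupied voxel.
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in buckets.values()]

points = [(0.01, 0.02, 0.00), (0.03, 0.01, 0.02),  # same voxel
          (0.51, 0.52, 0.50)]                       # different voxel
reduced = voxel_downsample(points, voxel_m=0.1)
print(len(reduced))
```

A coarser voxel size gives lighter models for early layout discussions, while a finer one preserves detail for lighting and camera planning.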
