
Browsing by Title


  • Nyman, Thomas (2014)
    Operating system-level virtualization is a virtualization technology based on running multiple isolated userspace instances, commonly referred to as containers, on top of a single operating system kernel. The fundamental difference compared to traditional virtualization is that the targets of virtualization in OS-level virtualization are kernel resources, not hardware. OS-level virtualization is used to implement Bring Your Own Device (BYOD) policies on contemporary mobile platforms. Current commercial BYOD solutions, however, do not allow applications to be containerized dynamically upon user request. The ability to do so would greatly improve the flexibility and usability of such schemes. In this work we study whether existing OS-level virtualization features in the Linux kernel can meet the needs of use cases reliant on such dynamic isolation. We present the design and implementation of a prototype which allows applications in dynamic isolated domains to be migrated from one device to another. Our design fits together with security features in the Linux kernel, allowing the security policy influenced by user decisions to be migrated along with the application. The deployability of the design is improved by basing the solution on functionality already available in the mainline Linux kernel. Our evaluation shows that the OS-level virtualization features in the Linux kernel indeed allow applications to be isolated in a dynamic fashion, although known gaps in the compartmentalization of kernel resources require trade-offs between security and interoperability in the design of such containers.
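The kernel-resource view of containment described above can be made concrete: on Linux, a process's namespace memberships are visible as symlink targets under /proc/&lt;pid&gt;/ns, and two processes share a container-like domain only if those namespaces coincide. A minimal sketch with hypothetical namespace identifiers (not tied to the thesis prototype):

```python
# In OS-level virtualization, the unit of isolation is the kernel namespace:
# two processes belong to the same container-like domain only if their
# namespaces coincide. On Linux the memberships appear as symlink targets
# under /proc/<pid>/ns/; the identifiers below are hypothetical examples.

def parse_ns(link: str) -> tuple[str, int]:
    """Split a target like 'pid:[4026531836]' into (type, inode number)."""
    ns_type, _, inode = link.partition(":")
    return ns_type, int(inode.strip("[]"))

def same_domain(a: dict[str, str], b: dict[str, str]) -> bool:
    """Processes share a domain iff every namespace inode matches."""
    return a.keys() == b.keys() and all(
        parse_ns(a[t]) == parse_ns(b[t]) for t in a
    )

host = {"pid": "pid:[4026531836]", "net": "net:[4026531905]"}
app1 = {"pid": "pid:[4026532201]", "net": "net:[4026532274]"}
app2 = {"pid": "pid:[4026532201]", "net": "net:[4026532274]"}

print(same_domain(host, app1), same_domain(app1, app2))
```

Dynamic isolation, in these terms, means moving a running application's processes into a fresh set of namespaces on request rather than at launch.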
  • Bankowski, Victor (2021)
    WebAssembly (WASM) is a binary instruction format for a stack-based virtual machine, originally designed for the Web but also capable of running outside browser contexts. The WASM binary format is designed to be fast to transfer, load and execute. WASM programs are designed to be safe to execute by running them in a memory-safe, sandboxed environment. Combining dynamic linking with WebAssembly could allow the creation of adaptive modular applications that are cross-platform and sandboxed but still fast to load and execute. This thesis explores implementing dynamic linking in WebAssembly. Two artifacts are presented: a dynamic linking runtime prototype which exposes a POSIX-like host function interface for modules, and an Android GUI interfacing prototype built on top of the runtime. In addition, the results of measurements performed on both artifacts are presented. Dynamic linking does improve the memory usage and the startup time of applications when only some modules are needed. However, if all modules are needed immediately, then dynamically linked applications perform worse than statically linked ones. Based on the results, dynamically linking WebAssembly modules could be a viable technology for PC and Android. The poor performance of a Raspberry Pi in the measurements indicates that dynamic linking might not be viable for resource-constrained systems, especially if applications are performance critical.
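The startup and memory trade-off described above comes from deferring module instantiation until first use. A small sketch of the idea using Python's importlib as a stand-in for a WASM dynamic linker (the module names are illustrative stdlib stand-ins, not the thesis runtime):

```python
import importlib

class LazyLinker:
    """Minimal sketch of demand-driven linking: a module is resolved and
    instantiated only on first use, so startup pays only for what is
    actually needed. Stdlib module names stand in for WASM modules."""

    def __init__(self, module_names):
        self._names = set(module_names)
        self._loaded = {}

    def get(self, name):
        if name not in self._names:
            raise KeyError(f"unknown module: {name}")
        if name not in self._loaded:        # "link" on first use only
            self._loaded[name] = importlib.import_module(name)
        return self._loaded[name]

    @property
    def linked(self):
        return sorted(self._loaded)

app = LazyLinker(["fractions", "statistics", "decimal"])
half = app.get("fractions").Fraction(1, 2)  # only 'fractions' gets linked
print(half, app.linked)
```

If every module in the list were needed immediately, the per-call resolution overhead would make this slower than linking everything up front, mirroring the measurement result above.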
  • Nurminen, Niilo Waltteri (2021)
    Phase transitions in the early Universe and in condensed matter physics are active fields of research. During these transitions, objects such as topological solitons and defects are produced by the breaking of symmetry. Studying such objects more thoroughly could shed light on some of the modern problems in cosmology such as baryogenesis and explain many aspects in materials research. One example of such topological solitons are the (1+1) dimensional kinks and their respective higher dimensional domain walls. The dynamics of kink collisions are complicated and very sensitive to initial conditions. Making accurate predictions within such a system has proven to be difficult, and research has been conducted since the 70s. Especially difficult is predicting the location of resonance windows and giving a proper theoretical explanation for such a structure. Deeper understanding of these objects is interesting in its own right but can also bring insight in predicting their possibly generated cosmological signatures. In this thesis we have summarized the common field theoretic tools and methods for the analytic treatment of kinks. Homotopy theory and its applications are also covered in the context of classifying topological solitons and defects. We present our numerical simulation scheme and results on kink-antikink and kink-impurity collisions in the $\phi^4$ model. Kink-antikink pair production from a wobbling kink is also studied, in which case we found that the separation velocity of the produced kink-antikink pair is directly correlated with the excitation amplitude of the wobbling kink. Direct annihilation of the produced pair was also observed. We modify the $\phi^4$ model by adding a small linear term $\delta \phi^3$, which modifies the kinks into accelerating bubble walls. The collision dynamics and pair production of these objects are explored with the same simulation methods. 
We observe multiple new effects in kink-antikink collisions, such as potentially perpetual bouncing and faster bion formation in comparison to the $\phi^4$ model. We also show that the $\delta$ term defines the preferred vacuum by inevitably annihilating any kink-antikink pair. During pair production we noticed a momentum transfer between the produced bion and the original kink, and direct annihilation seems unlikely in such processes. For wobbling kink-impurity collisions we found an asymmetric spectral wall. Future research prospects and potential expansions of our analysis are also discussed.
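The kink dynamics summarized above can be reproduced qualitatively with a few lines of finite-difference code. A sketch of a boosted kink-antikink initial condition evolved under the $\phi^4$ equation of motion (grid size, separation and velocity are illustrative choices, not the thesis setup):

```python
import numpy as np

# Finite-difference sketch of a kink-antikink collision in the phi^4 model
# with V(phi) = (phi^2 - 1)^2 / 4, i.e. phi_tt = phi_xx + phi - phi^3.
L, N, dt, steps = 40.0, 1024, 0.02, 500
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]
v, x0 = 0.2, 8.0                        # illustrative velocity and separation
g = 1.0 / np.sqrt(1.0 - v * v)          # Lorentz factor of the boosted kinks

kink = lambda u: np.tanh(u / np.sqrt(2.0))
dkink = lambda u: (1.0 - np.tanh(u / np.sqrt(2.0)) ** 2) / np.sqrt(2.0)

# Kink at -x0 moving right, antikink at +x0 moving left.
phi = kink(g * (x + x0)) - kink(g * (x - x0)) - 1.0
phi_t = -g * v * (dkink(g * (x + x0)) + dkink(g * (x - x0)))

for _ in range(steps):                  # semi-implicit (symplectic) Euler
    lap = (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)) / dx**2
    phi_t += dt * (lap + phi - phi**3)
    phi += dt * phi_t

print(float(phi[N // 2]))               # still near +1 before the collision
```

Resonance windows and bion formation appear in such runs only when the impact velocity and evolution time are pushed much further; this sketch just sets up the standard collision geometry.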
  • Kähönen, Simo (2020)
    Context. A Software Product Line (SPL) is a set of software system products that share common features alongside product-specific features. A Dynamic Software Product Line (DSPL) is an SPL that features runtime variability. Objective. The main objective of this study is to evaluate the latest research related to SPL dynamic variability in general. The second objective is to investigate dynamic variability modeling methods and tools utilized and introduced for SPLs by scholars. The third objective is to investigate testing methods and tools utilized and introduced for DSPLs by scholars. Method. The research method of this study is a Systematic Literature Review (SLR). The papers included were published between 2015 and 2017. Four scientific digital libraries were used as data sources for the papers. Results. The main result of this study is that between 2015 and 2017 there was an active research community studying SPL dynamic variability. For all 25 papers included in this study, on a scale of 0 to 10, the arithmetic mean of the quality scores is 7.14 (the median is 7.5). One industrial-practice DSPL implementation case study was presented by the scholars; three other case studies appeared to be more or less simplified exemplars of industry-practice DSPL implementations. Two studies focused on testing aspects of DSPLs. The second result is that scholars have utilized 19 existing dynamic variability modeling methods for SPLs and introduced 17 new ones, and have utilized seven existing dynamic variability modeling tools for SPLs and introduced four new ones. The third result is that scholars have introduced four new testing methods for DSPLs and utilized two existing testing tools for DSPLs. Conclusions. The general conclusion of this study is that although SPL dynamic variability was actively studied between 2015 and 2017, open research areas remain, especially in industry-practice use and testing of DSPLs. 2012 ACM Computing Classification System (CCS): Software and its engineering -> Software creation and management -> Software development techniques -> Reusability -> Software product lines; Software and its engineering -> Software creation and management -> Designing software -> Software design engineering
  • Kainulainen, Henna (2015)
    In this thesis we consider dynamic X-ray computed tomography (CT) in a two-dimensional case. In X-ray CT we take X-ray projection images from many different directions and compute a reconstruction from those measurements. Sometimes the change over time in the imaged object needs to be taken into account, for example in cardiac imaging or in angiography. This is why we look at the dynamic case, where something changes in time while the measurements are taken. At the beginning of the thesis, in chapter 2, we present the necessary theory on the subject. We first go through some general theory about inverse problems and then concentrate on X-ray CT. We discuss the ill-posedness of inverse problems, regularization, and the measurement process in CT. Different measurement settings and the discretization of the continuous case are introduced. In chapter 3 we introduce a solution method for the problem: total variation regularization with the Barzilai-Borwein minimization method. The Barzilai-Borwein minimization method is an iterative method well suited for large-scale problems. We also explain two different methods for choosing the regularization parameter needed in the minimization process: the multi-resolution parameter choice method and the S-curve method. Chapter 4 describes the materials used in the thesis. We have both simulated and real measured data. The simulated data was created using rendering software, and for the real data we took X-ray projection images of a Lego robot. The results of the tests done on the data are shown in chapter 5. We ran tests on both the simulated and the measured data with two different measurement settings: first assuming we have 9 fixed source-detector pairs, and then that we have only one source-detector pair. For the case with only one pair, we tested the implemented regularization method by first considering the change in the imaged object to be periodic. Then we assume we can use only a certain number of consecutive moments, based on the rate at which the object is changing, to collect the data. Here we get only one X-ray projection image at each moment and combine measurements from multiple different moments. In the last chapter, chapter 6, we discuss the results. We noticed that the regularization method is quite slow, at least partly because of the functions used in the implementation. The results obtained were quite good, especially for the simulated data. The simulated data had fewer details than the measured data, so it makes sense that we got better results with the simpler data. Already with only four angles we could see some details in the simulated data; for the measured data, details were also visible with 8 and 16 angles.
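The minimization scheme from chapter 3 can be sketched on a one-dimensional denoising analogue: smoothed total variation minimized with Barzilai-Borwein step lengths. The signal, the regularization weight and the stability cap below are illustrative assumptions, not the thesis implementation, which applies the same idea to 2D dynamic CT reconstruction:

```python
import numpy as np

# Smoothed total-variation (TV) denoising of a 1D piecewise-constant signal,
# minimized with Barzilai-Borwein (BB) step lengths.
rng = np.random.default_rng(0)
truth = np.concatenate([np.zeros(50), np.ones(50)])
y = truth + 0.1 * rng.standard_normal(truth.size)   # noisy measurement
lam, eps = 0.3, 0.05                                # TV weight, smoothing

def grad(u):
    """Gradient of F(u) = 0.5||u - y||^2 + lam * sum sqrt((Du)^2 + eps^2)."""
    d = np.diff(u)                                  # forward differences Du
    w = d / np.sqrt(d * d + eps * eps)              # psi'(Du)
    div = np.concatenate([[-w[0]], w[:-1] - w[1:], [w[-1]]])  # D^T w
    return (u - y) + lam * div

u, g = y.copy(), grad(y)
alpha = 0.01                                        # conservative first step
for _ in range(200):
    u_new = u - alpha * g
    g_new = grad(u_new)
    s, dg = u_new - u, g_new - g
    # BB1 step length, capped for stability on this non-quadratic problem.
    alpha = min((s @ s) / max(s @ dg, 1e-12), 0.05)
    u, g = u_new, g_new

print(float(np.mean((u - truth) ** 2)))             # error of denoised signal
```

The BB step reuses only the previous iterate and gradient, which is why the method scales to the large reconstruction problems mentioned above.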
  • Nurmi, Marisofia (2021)
    Globally, there is a constant shortfall of financial resources in conservation, which has partially been supplemented by combining conservation and conservation-compatible businesses. Many protected and conserved areas in sub-Saharan Africa are largely funded by revenues generated within the area, mainly through ecotourism. While ecotourism revenues are bringing money into the system, dependency on this single type of revenue source makes conservation areas – or even the whole protected area system – vulnerable to changes in visitor numbers, which are prone to different political or socio-economic disturbances (such as conflicts, economic recession, and epidemics). A sudden substantial decrease in revenues or increase in costs may threaten the existence, extent, and quality of conservation areas in terms of biodiversity conservation. Collecting and analysing economic information on protected and conserved areas can help investigate their long-term sustainability and resilience to financial threats, such as the COVID-19 pandemic and related economic outcomes. In this thesis, I assess how conservation costs and revenues vary between different types of protected and conserved areas, how financially self-sufficient they are, and how economically resilient these areas may be in the face of global changes. The analysis is based on financial data from different types of protected and conserved areas in South Africa: state-owned national parks (South African National Parks, later SANParks), provincial parks (Ezemvelo KwaZulu-Natal Wildlife, later Ezemvelo) and private conserved areas. With the use of simulation modelling and resilience theory, I discuss how potential economic resilience varies between protected areas. The findings indicate that there are significant differences in the cost-revenue structure of different kinds of protected and conserved areas, especially between public and private ones.
Ezemvelo receives most of its funds from the provincial government, whereas SANParks covers the majority of its costs from tourism revenues. Private game reserves again need to cover their costs independently. According to the findings, size is an important attribute for predicting the per-hectare net income and running costs of public protected areas but has no significant influence on those of private game reserves. For public protected areas, the running costs per hectare are significantly higher for areas of less than 1000 hectares. Based on the economic modelling and resilience theory, I conclude that private game reserves are generally financially more viable, but their vulnerability lies in their lack of embeddedness within a larger system (e.g., a conservation organization) that could support them during difficult times and require and encourage a long-term commitment to conservation. The economic resilience of public protected areas is more closely tied to the political atmosphere regarding conservation funding: self-generated revenues form only a part of the budgets of public protected areas. In addition, protected areas which have large fixed costs and depend on high tourism revenues are likely to be less economically resilient. Because of the higher running costs and resultant sensitivity of net income to changes in costs and revenues, parks that are home to the "Big Five" species (lion, leopard, rhino, elephant and buffalo) are in a more vulnerable position in the face of disturbances such as the pandemic. To address the threats that upcoming socio-economic disturbances pose to the funding base of protected and conserved areas, more focus should be given to the economic resilience of these areas, especially in countries and situations where the areas rely on self-generated revenues.
  • Kurvinen, Pasi (Helsingin yliopisto / University of Helsinki / Helsingfors universitet, 2003)
  • Piekkola, Elina (2013)
    The East Usambara Mountains, situated in north-eastern Tanzania, are globally recognized tropical forests with a high biodiversity value. The Amani Nature Reserve encloses a high concentration of endemic species within versatile biodiversity. The aim of this study is to assess the ecotourism possibilities and potential of the Amani Nature Reserve, to provide a sustainable option for regional development and livelihoods, and to outline the regional characteristics important in terms of ecotourism. The data for this study was gathered during a field trip in Tanzania in January-March 2012, as part of an internship for WWF Finland's Coastal East Africa Initiative. The qualitative methods used included a structured questionnaire, semi-structured and in-depth interviews, field observation and literature analysis. Several discussions between different regional stakeholders were also carried out. Six villages in the East Usambara Mountains were studied. The study concludes that the Amani Nature Reserve has high potential for ecotourism development and that the area offers diverse nature-related activities, albeit the current visitor numbers are low. The overall results indicate the high value of and possibilities offered by regional biodiversity and the locals' positive attitudes towards tourism, but also the area's weaknesses: poor infrastructure and a lack of facilities and services. The locals' willingness to cooperate and participate in ecotourism functions, as well as existing cultural assets, were also recognized. The Amani Nature Reserve's location, uniqueness and existing facilities strongly support future ecotourism development. However, the locals' knowledge of tourism impacts and conservation issues should be reinforced, because these tropical forests currently face multiple threats such as population growth and forest fragmentation. Ecotourism could reinforce forest conservation, local empowerment and sustainable livelihoods. In order to safeguard the ecotourism resource base, the environment, ecotourism actions need to follow the ecotourism objectives and principles and consider different spatial environmental, social and economic characteristics. According to these principles, the locals must be integrated into actions and decision-making processes at all levels, and careful ecotourism planning, management and monitoring must take place. Developing an ecotourism network in Tanzania is highly feasible because of the country's spectacular natural beauty and political stability. In order to safeguard the remaining life-supporting wildlife, different stakeholders and locals should also be engaged to work in cooperation, seeking sustainable conservation means such as ecotourism.
  • Häggblom, Svante (2019)
    Background: User experience (UX) is seen as an important quality of a successful product, and software companies are becoming increasingly interested in the field of UX. As UX has the goal of improving the experience of users, there is a need for better methods of measuring the actual experience. One aspect of UX is to understand the emotional aspect of experience. Psychophysiology studies the relations between emotions and physiology, and electrodermal activity (EDA) has been found to be a physiological measurement of emotional arousal. Aims: The aim of this thesis is to research the utility of measuring EDA to identify moments of emotional arousal during human-computer interaction. By studying peaks in EDA during software interaction we expect to find issues in the software that work as triggers or stimuli for the peaks. Method: We used the design science methodology to develop EDAMUX. EDAMUX is a method to unobtrusively observe users, while gathering significant interaction moments through self-reporting and EDA. A qualitative single-case study was conducted to evaluate the utility of EDAMUX. Results: We found that we can discover causes of bad user experience with EDAMUX. Moments of emotional arousal, derived from EDA, were found in conjunction with performance issues, usability issues and bugs. Emotional arousal was also observed during software interaction where the user was blaming themselves. Conclusions: EDAMUX shows potential in discovering issues in software that are difficult to find with methods that rely on subjective self-reporting. Having the potential to objectively study emotional reactions is seen as valuable in complementing existing methods of measuring user experience.
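Flagging moments of arousal from an EDA trace amounts to peak detection over a slowly drifting baseline. A sketch on a synthetic signal; the sampling rate, threshold and look-back window are hypothetical choices, not EDAMUX's actual processing:

```python
import numpy as np

# Synthetic EDA trace: tonic drift plus two bell-shaped phasic responses
# (skin-conductance responses, SCRs). All parameters are illustrative.
fs = 4                                    # Hz, a common EDA sampling rate
t = np.arange(0, 60, 1 / fs)              # one minute of signal
tonic = 2.0 + 0.005 * t                   # slow baseline drift (microsiemens)
scr = lambda t0: 0.4 * np.exp(-((t - t0) ** 2) / 8.0)
eda = tonic + scr(15) + scr(42)

def find_peaks(sig, min_rise=0.1):
    """Local maxima whose height above the recent minimum exceeds min_rise."""
    peaks = []
    for i in range(1, len(sig) - 1):
        if sig[i - 1] < sig[i] >= sig[i + 1]:
            base = sig[max(0, i - 8 * fs):i].min()   # look back ~8 seconds
            if sig[i] - base > min_rise:
                peaks.append(i)
    return peaks

moments = [t[i] for i in find_peaks(eda)]
print(moments)                            # times of candidate arousal moments
```

In a study like the one above, each detected time would then be matched against the interaction log to find the triggering software event.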
  • Shestovskaya, Jamilya (2020)
    Nowadays the number of connected devices is growing sharply. Mobile phones and other IoT devices are an inherent part of everyday life and are used everywhere. The amount of data generated by IoT devices and mobile phones is enormous, which causes network congestion. In turn, the use of centralized cloud architecture increases delay and causes jitter. To address these issues, the research community has discussed a new trend of decentralization: edge computing. There are different edge computing architectures suggested by various researchers; some are more popular and supported by global companies, and most of them have similarities. In this research we reviewed seven edge computing architectures. This thesis is a comparative analysis of them, carried out using key attributes and presented as a Venn diagram, to help select the right edge computing architecture.
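A Venn-diagram comparison of this kind reduces to set operations over per-architecture attribute sets. A sketch with hypothetical architecture names and attributes (placeholders, not the seven architectures reviewed in the thesis):

```python
# Each architecture maps to a set of key attributes; regions of the Venn
# diagram are intersections and differences of these sets. All names and
# attributes below are hypothetical placeholders.
attrs = {
    "ArchA": {"virtualization", "local-storage", "orchestration", "offloading"},
    "ArchB": {"virtualization", "orchestration", "mobility"},
    "ArchC": {"virtualization", "local-storage"},
}

common = set.intersection(*attrs.values())      # centre region: shared by all
only_a = attrs["ArchA"] - attrs["ArchB"] - attrs["ArchC"]  # unique to ArchA

print(sorted(common), sorted(only_a))
```

Selecting "the right" architecture then becomes checking which candidate's attribute set covers the requirements of a given deployment.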
  • Kovala, Jarkko (2020)
    Internet of Things (IoT) has the potential to transform many domains of human activity, enabled by the collection of data from the physical world at a massive scale. As the projected growth of IoT data exceeds that of available network capacity, transferring it to centralized cloud data centers is infeasible. Edge computing aims to solve this problem by processing data at the edge of the network, enabling applications with specialized requirements that cloud computing cannot meet. The current market of platforms that support building IoT applications is very fragmented, with offerings available from hundreds of companies with no common architecture. This threatens the realization of IoT's potential: with more interoperability, a new class of applications that combine the collected data and use it in new ways could emerge. In this thesis, promising IoT platforms for edge computing are surveyed. First, an understanding of current challenges in the field is gained through studying the available literature on the topic. Second, IoT edge platforms having the most potential to meet these challenges are chosen and reviewed for their capabilities. Finally, the platforms are compared against each other, with a focus on their potential to meet the challenges learned in the first part. The work shows that AWS IoT for the edge and Microsoft Azure IoT Edge have mature feature sets. However, these platforms are tied to their respective cloud platforms, limiting interoperability and the possibility of switching providers. On the other hand, open source EdgeX Foundry and KubeEdge have the potential for more standardization and interoperability in IoT but are limited in functionality for building practical IoT applications.
  • Sinikallio, Laura (2022)
    The digitization and structuring of parliamentary materials for research use is an emerging field of study, with several national projects currently under way in Europe, for example. This thesis is part of the Semantic Parliament project, in which the plenary speeches of the Finnish Parliament (eduskunta) are brought together for the first time into a single, harmonized, machine-readable dataset covering the entire history of Parliament from 1907 to the present day. The speeches and their rich descriptive metadata have been published in two versions: in the Parla-CLARIN XML format used for representing parliamentary data, and as a linked open data knowledge graph that connects the dataset to the wider national data infrastructure. The unified speech corpus offers unprecedented opportunities for examining Finnish parliamentarism over more than a hundred years in a nuanced and automated way. The dataset contains nearly a million individual speeches and links closely to the biographical information of parliamentary actors. This thesis describes the data models developed for representing the speeches and the process of collecting and transforming the speech data, and examines the challenges and opportunities of the process and the resulting dataset. To evaluate the usefulness of the published dataset, the Parla-CLARIN data has already been used in digital humanities research on political culture. On top of the linked data, a semantic portal, Parlamenttisampo, has been developed for publishing and studying the data on the web.
  • Katila, Nina (2020)
    This thesis examines, on the basis of scientific research and professional literature, means and considerations for automating the integration testing of information systems. The research methodology is a case study. The case environment is the preconditions for, and the different implementation options of, automating the integration testing between the information systems supporting the legislative work of the Finnish Parliament. Information for the thesis was collected from the documentation of the Parliament's information systems and from the experts on those systems. The integration testing workflows and testing challenges of Parliament are based on observations made while participating in the Parliament's integration testing for about a year. The analysis and evaluation of the automation options are based on continuous cooperation with the experts of the Parliament's legislative information systems and integration system. The information systems supporting the Parliament's legislative work are functionally and administratively independent and differ in their implementers, implementation, and age. Because the program code of these independent systems is not available across system boundaries, an integration test automation solution must rely on what is reachable through the systems' user interfaces. The study found that Robotic Process Automation (RPA) can imitate the integration testing that the Parliament's testers perform through the systems' user interfaces. The study also found that with an automation framework suited to test automation and RPA, it is possible to automate the integration testing of the Parliament's information systems. With RPA, integration tests can be executed significantly faster and with fewer resources than with manual testing. The most significant drawback of UI-based test automation is the cost of test maintenance. With a modular keyword-driven automation framework, the automated tests of the individual systems and their parts can be reused in integration test automation, saving costs.
  • Vihko, Sami (2022)
    We review techniques of perturbative thermal quantum chromodynamics (QCD) in the imaginary-time formalism (ITF). The infrared (IR) problems arising from the perturbative treatment of the equilibrium thermodynamics of QCD, and their phenomenological causes, are investigated in detail. We also discuss the construction of the two effective field theory (EFT) frameworks most often used in modern high-precision calculations to overcome them: the dimensionally reduced theories EQCD and MQCD, and hard thermal loop (HTL) effective theory. EQCD is three-dimensional Euclidean Yang-Mills theory coupled to an adjoint scalar field, and MQCD is three-dimensional Euclidean pure Yang-Mills theory. The effective parameters in these theories are determined through matching calculations. HTL is based on the resummation of hard thermal loops and uses effective propagators and vertex functions. We also discuss the perturbative determination of the pressure of QCD. Throughout, the thesis details the calculations and the methodology.
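Schematically, dimensional reduction trades the four-dimensional theory for a three-dimensional Lagrangian of the form below (the standard form of the EQCD Lagrangian, written here up to normalization conventions as an illustration, not quoted from the thesis):

```latex
% EQCD: 3d Euclidean Yang-Mills coupled to an adjoint scalar A_0
% (schematic normalization)
\mathcal{L}_{\mathrm{EQCD}}
  = \tfrac{1}{2}\,\mathrm{Tr}\,F_{ij}F_{ij}
  + \mathrm{Tr}\,(D_i A_0)(D_i A_0)
  + m_{\mathrm{E}}^{2}\,\mathrm{Tr}\,A_0^{2}
  + \lambda_{\mathrm{E}}\left(\mathrm{Tr}\,A_0^{2}\right)^{2}
  + \dots
```

Integrating out the massive adjoint scalar $A_0$ (the scale $gT$) then leaves MQCD, pure three-dimensional Yang-Mills theory at the scale $g^2 T$; the effective parameters $m_{\mathrm{E}}$ and $\lambda_{\mathrm{E}}$ are fixed by the matching calculations mentioned above.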
  • Räsänen, Hannele (2020)
    Nowadays, with the influence of the global economy, large corporations use global software development to utilize the advantages of geographically decentralized organizations and globally outsourced software development. Through distributed organizations the work can be done around the clock. Global software development is affected by three distance dimensions: temporal distance, geographical distance, and socio-cultural distance, each of which brings challenges. At the same time, the agile way of working has become an increasingly popular method in software development. As agile practices were created for co-located teams, there is a demand for working online solutions for communication and collaboration in distributed teams. Corporations use scaled agile ways of working to support the software development of large initiatives and projects. The Scaled Agile Framework (SAFe) is the most popular of the scaled agile methods. This thesis was conducted as a case study in a multinational corporation. The objective of the case study was to research the effectiveness of the scaled agile methodology SAFe on communication and collaboration in teams and agile release trains. The case study included two parts: a web survey and interviews. The results of the analyses support findings from the literature in the field. They indicate the importance of communication and collaboration in agile practices and the significance of the online tools that support them.
  • Martinmäki, Petri (2013)
    The main purpose of this master's thesis is to present experiences of test automation in an industrial case and to make recommendations of best practices for the case. The recommendations are based on successful test automation stories found in the existing literature. The main issues hindering test automation seem to be similar in the case example and in the literature: the cost of implementation and maintenance combined with unrealistic expectations are perceived in almost every project. However, the most successful projects put a lot of effort into planning and implementing maintainable sets of automatic tests. In conclusion, the evidence from the literature shows that successful test automation needs investment, especially at the beginning of a project. A few specific best practices are adapted to the case project and presented in a form in which they could be applied.
  • Mäkelä, Ville (2022)
    With their ability to convert chemical energy to electrical energy through electrochemical reactions, rechargeable batteries are widely used to store energy in applications such as mobile electronics, aerospace and aviation, road transportation, the power grid, and the national defense industry. Numerous battery types are available commercially. Lithium-ion-based batteries stand out due to several key advantages such as high operating voltage, high specific energy, and long cycle life, and they dominate the market in a wide range of electric vehicles. However, like all battery technologies, lithium-ion-based ones suffer from the effects of aging-induced degradation, which can lead to reduced capacity and lifetime, and in some cases even safety hazards. One method of preventing or slowing down these aging reactions is to modify the standard battery materials with dopants and additives: specific impurities purposely introduced into the battery during the manufacturing process. In this master's thesis, the effect of additives (Mg/Al) on the aging of Li-ion cells was examined using X-ray absorption spectroscopy, more specifically X-ray absorption near edge structure (XANES). For the experiment, 7 different cells, all containing lithium cobalt oxide as the major component (4 with a stoichiometric ratio of Li/Co and 3 Li-rich), 5 of them containing Mg/Al as dopants and 2 containing no dopants, were examined using XANES as a function of aging in terms of charge/discharge cycles. The dopants were introduced at different stages of the material preparation, either at the lithiation step or at the synthesis of the precursor. This thesis focuses on the XANES experiment and the data analysis, with an extensive literature review on the use of additives and dopants. The cells were prepared by Aalto University.
The results showed that of the cells with dopant materials, the cells doped during lithiation stage aged slightly better after cycling than the undoped ones, whereas the cells doped during precursor stage aged worse than the undoped cells. This would suggest that doping might be more effective when done during the lithiation stage.
  • Suorsa, Matti Valtteri (2017)
    In Finland, the spent nuclear fuel will be deposited at a depth of 400 m in the granitic bedrock. The disposal is based on the KBS-3 concept, which relies on the multi-barrier principle, where successive barriers prevent the migration of radionuclides to the biosphere. The spent nuclear fuel is placed in the disposal tunnels in copper-iron canisters, which are surrounded by bentonite clay to insulate them from groundwater flow and protect them from movements of the bedrock. Bentonite clay consists mainly of montmorillonite, which, like other aluminosilicates, is known to retain radionuclides, thus contributing to their retention or immobilization. Besides its contribution to the multi-barrier system, the bentonite buffer is assumed to be a potential source of colloids due to the erosion of bentonite in certain conditions. Colloids, in the context of radionuclide migration, are nanoparticles in the size range from 1 to 1000 nm that remain suspended in water. Montmorillonite colloids could potentially act as carriers for otherwise immobile radionuclides, such as transuranium elements, in the case of canister failure. In particular, 241Am is an important radionuclide for the long-term safety of final disposal, as after a few hundred years 241Am and its mother nuclide 241Pu contribute most to the radiotoxicity of the spent nuclear fuel. The relevance of the colloids to long-term performance depends on several factors such as colloid stability, mobility and their interaction with radionuclides. Colloid stability depends on groundwater conditions such as ionic strength and pH. In low-salinity groundwaters, montmorillonite colloids have been shown to be stable. On the other hand, the collective processes of the rock matrix, bentonite colloids and radionuclides have to be investigated to assess the long-term performance of the multi-barrier system.
Understanding the complex processes affecting colloid-facilitated radionuclide migration requires combining experiments at different scales, from simple laboratory experiments to large, natural-scale in-situ experiments. Large-scale laboratory experiments conducted with granite blocks offer an intermediate between the two extremes, having a more natural system than the former and better controllability than the latter. In this study, radionuclide migration was studied in laboratory experiments of different scales. The colloid-facilitated transport of Eu was studied in a block-scale experiment using a granite block with a natural water-conducting fracture. The suitability of the block was assessed by conducting several experiments using different non-sorbing and sorbing tracers and montmorillonite colloids separated from synthetic Ni-labeled montmorillonite and Nanocor PGN montmorillonite (98 %). Laser-induced breakdown detection (LIBD), photon correlation spectroscopy (PCS) and ICP-/MP-OES were utilized in colloid detection. Supportive batch experiments were conducted to study colloid stability in different groundwaters and the interaction between the granite, the different montmorillonite colloids and Eu, an analogue of Am. Good reproducibility was obtained with the non-sorbing tracers. The breakthrough of radioactive 3H and 36Cl and of the fluorescein and Amino-G dyes showed similar behavior. On the other hand, no breakthrough of montmorillonite colloids or 152Eu occurred. Based on the literature review, the low flow rates used could be the reason for this. The low flow rate (50 μl/min) could strongly affect colloid mobility, which could explain why Eu was retained in the fracture. More experiments with higher flow velocities would be required. The different montmorillonite materials showed similar but not exactly the same sorption behavior toward Eu.
The fraction of Eu attached to colloids decreased during the experiments, and correspondingly the fraction attached to the granite increased. At the same time, the colloids remained stable during the experiments. This indicates that desorption of Eu from the colloids takes place in the presence of granite. The effect of different water compositions on the stability of the colloids was also clearly seen in the preparation of colloid suspensions in the different water simulants. Even a small increase in the ionic strength of the solution made especially the Ni-montmorillonite colloids unstable.
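The ionic-strength sensitivity noted above is commonly rationalized through the Debye screening length of DLVO colloid theory: raising the ionic strength compresses the electric double layer around the clay platelets and weakens their electrostatic repulsion. The sketch below uses the standard textbook approximation for a 1:1 electrolyte in water at 25 °C; it is illustrative background, not code or data from the thesis.

```python
import math

def debye_length_nm(ionic_strength_mol_per_l):
    """Approximate Debye screening length (nm) in water at 25 degrees C
    for a 1:1 electrolyte: kappa^-1 ~= 0.304 / sqrt(I), with I in mol/L."""
    return 0.304 / math.sqrt(ionic_strength_mol_per_l)

# A two-orders-of-magnitude rise in ionic strength shrinks the double
# layer tenfold, which is why modestly saline water can destabilize
# montmorillonite colloids.
print(round(debye_length_nm(0.001), 2))  # dilute groundwater, ~9.61 nm
print(round(debye_length_nm(0.1), 2))    # saline groundwater, ~0.96 nm
```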
  • Lehtinen, Simo (2021)
    The solar corona constantly emits a flow of charged particles, called the solar wind, into interplanetary space. This flow is diverted around the Earth by the magnetic pressure of the Earth’s own geomagnetic field, shielding the Earth from the effect of this particle radiation. On occasion the Sun ejects a large amount of plasma outwards from the corona in an event called a coronal mass ejection (CME). Such events can drive discontinuities in the solar wind plasma, called interplanetary shocks. Shocks can affect the Earth’s magnetosphere, compressing it inwards and generating electromagnetic waves inside it. This thesis presents a study of the ultra-low-frequency (ULF) wave response of the magnetosphere to CME-driven shocks. Geomagnetic pulsations are ultra-low-frequency plasma waves in the magnetosphere, observable with ground-based magnetometers. The compression of the magnetosphere by interplanetary shocks generates geomagnetic pulsations in the Pc4 and Pc5 frequency ranges (2 - 22 mHz). These waves play an important role in magnetospheric dynamics and in the acceleration and depletion of high-energy electrons in the radiation belts. We consider 39 interplanetary shock events driven by CMEs and analyse ground-based magnetometer data from stations located near local noon at the time of shock arrival. Solar wind measurements are used to categorise the interplanetary shocks by their Mach number and dynamic pressure differential as the main indicators of shock strength. The importance of these parameters in determining the strength of the wave response in the geomagnetic field is then studied using wavelet analysis and superposed epoch analysis. Stronger shocks are found to result in larger increases in wave activity, especially in the Pc4 range.
Ground stations at higher latitudes observe higher wave power, but there is an interesting anomaly in the Pc4 range at stations magnetically connected to regions near the plasmapause, which show an enhanced wave-power response. We quantify the decay time of the wave activity and find that it is around 20 hours for Pc5 waves and 7 hours for Pc4 waves.
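A decay time like the Pc4/Pc5 values quoted above can be estimated by fitting an exponential to post-shock wave power, i.e. a straight line to log(power) versus time. The sketch below does this on a synthetic time series; the function name and data are illustrative assumptions, not the analysis code from the thesis.

```python
import numpy as np

def decay_time_hours(t_hours, power):
    """Estimate the e-folding decay time of wave power by a
    least-squares line fit to log(power) versus time (hours)."""
    slope, _ = np.polyfit(t_hours, np.log(power), 1)
    return -1.0 / slope  # e-folding time in hours

# Synthetic Pc5-like example: power decaying with a 20 h time constant.
t = np.linspace(0.0, 40.0, 200)      # hours since shock arrival
power = 5.0 * np.exp(-t / 20.0)      # arbitrary power units
print(round(decay_time_hours(t, power), 1))  # → 20.0
```

With real magnetometer data the power would come from band-integrated wavelet spectra rather than a clean exponential, so the fit would be applied to the superposed-epoch average.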
  • Pakkanen, Noora (2021)
    In Finland, the final disposal of spent nuclear fuel will start in the 2020s, with the spent nuclear fuel disposed of 400-450 meters deep in the crystalline bedrock. Disposal will follow the Swedish KBS-3 principle, in which the spent nuclear fuel canisters are protected by multiple barriers designed to prevent the migration of radionuclides to the surrounding biosphere. With multiple barriers, the failure of one barrier will not endanger the isolation of the spent nuclear fuel. The insoluble spent nuclear fuel will be stored in iron-copper canisters placed in vertical tunnels within the bedrock. The iron-copper canisters are surrounded by a bentonite buffer to protect them from groundwater and from movements of the bedrock. MX-80 bentonite has been proposed as the buffer material for the Finnish spent nuclear fuel repository. In the case of canister failure, the bentonite buffer is expected to sorb and retain radionuclides originating from the spent nuclear fuel. If the salinity of Olkiluoto Island's groundwater were to decrease, chemical erosion of the bentonite buffer could result in the generation of small particles called colloids. Under suitable conditions, these colloids could act as potential carriers for otherwise immobile radionuclides and transport them outside the facility area to the surrounding biosphere. The objective of this thesis work was to study the effect of MX-80 bentonite colloids on radionuclide migration within two granitic drill core columns (VGN and KGG) using two different radionuclides, 134Cs and 85Sr. Batch-type sorption and desorption experiments were conducted to gain information on the sorption mechanisms of the two radionuclides as well as on the sorption competition between MX-80 bentonite colloids and crushed VGN rock. The colloids were characterized with scanning electron microscopy (SEM), and particle concentrations were determined with dynamic light scattering (DLS).
Allard water mixed with MX-80 bentonite powder was used to imitate low-salinity groundwater conditions and colloids. Strontium broke through the VGN drill core column, whereas caesium did not break through either the VGN or the KGG column. Caesium sorption was more irreversible in nature than that of strontium, and caesium was thus strongly retained within both columns. With both radionuclides, the presence of colloids did not seem to notably enhance radionuclide migration. Breakthrough from the columns was affected by both radionuclide properties and colloid filtration within the tubes, stagnant pools and fractures. The experiments could be further complemented by conducting batch-type sorption experiments with crushed KGG rock and by introducing new factors to the column experiments. The experimental work was carried out at the Department of Chemistry, Radiochemistry, at the University of Helsinki.
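Batch-type sorption results such as those described above are conventionally summarized as a distribution coefficient K_d, the activity sorbed per gram of solid divided by the activity remaining per millilitre of solution. The sketch below applies the standard batch formula to made-up numbers; the values are hypothetical and do not come from the thesis.

```python
def distribution_coefficient(c0, c_eq, volume_ml, mass_g):
    """Batch-sorption distribution coefficient K_d (mL/g):
    K_d = (C0 - C_eq) / C_eq * V / m, where C0 and C_eq are the
    initial and equilibrium solution concentrations (or activities)."""
    return (c0 - c_eq) / c_eq * volume_ml / mass_g

# Hypothetical batch experiment: 90 % of the initial activity sorbs
# onto 0.1 g of crushed rock suspended in 10 mL of solution.
kd = distribution_coefficient(c0=1000.0, c_eq=100.0, volume_ml=10.0, mass_g=0.1)
print(kd)  # → 900.0 mL/g
```

Comparing K_d values measured with and without bentonite colloids present is one way to quantify the sorption competition between the colloids and the crushed rock.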