
Browsing by Title


  • Nurmi, Marisofia (2021)
    Globally, there is a constant shortfall of financial resources in conservation, which has partially been supplemented by combining conservation and conservation-compatible businesses. Many protected and conserved areas in sub-Saharan Africa are largely funded by revenues generated within the area, mainly through ecotourism. While ecotourism revenues bring money into the system, dependency on this single type of revenue source makes conservation areas – or even the whole protected area system – vulnerable to changes in visitor numbers, which are prone to different political or socio-economic disturbances (such as conflicts, economic recession, and epidemics). A sudden substantial decrease in revenues or increase in costs may threaten the existence, extent, and quality of conservation areas in terms of biodiversity conservation. Collecting and analysing economic information on protected and conserved areas can help investigate their long-term sustainability and resilience to financial threats, such as the COVID-19 pandemic and related economic outcomes. In this thesis, I assess how conservation costs and revenues vary between different types of protected and conserved areas, how financially self-sufficient they are, and how economically resilient these areas may be in the face of global changes. The analysis is based on financial data from different types of protected and conserved areas in South Africa: state-owned national parks (South African National Parks, later SANParks), provincial parks (Ezemvelo KwaZulu-Natal Wildlife, later Ezemvelo) and private conserved areas. With the use of simulation modelling and resilience theory, I discuss how potential economic resilience varies between protected areas. The findings indicate that there are significant differences in the cost-revenue structure of different kinds of protected and conserved areas, especially between public and private areas.
Ezemvelo receives most of its funds from the provincial government, whereas SANParks covers the majority of its costs from tourism revenues. Private game reserves, again, need to cover their costs independently. According to the findings, size is an important attribute for predicting the per hectare net income and running costs of public protected areas but has no significant influence on those of private game reserves. For public protected areas, the running costs per hectare are significantly higher for protected areas of less than 1000 hectares. Based on the economic modelling and resilience theory, I concluded that private game reserves are generally financially more viable, but their vulnerability lies in their lack of embeddedness within a larger system (e.g., a conservation organization) that could support them during difficult times and require and encourage a long-term commitment to conservation. The economic resilience of public protected areas is more closely tied to the political atmosphere regarding conservation funding: self-generated revenues form only a part of the budgets of public protected areas. In addition, protected areas which have large fixed costs and depend on high tourism revenues are likely to be less economically resilient. Because of their higher running costs and the resultant sensitivity of net income to changes in costs and revenues, parks that are home to the “Big Five” species (lion, leopard, rhino, elephant and buffalo) are in a more vulnerable position in the face of disturbances such as the pandemic. To address the threats that upcoming socio-economic disturbances pose to the funding base of protected and conserved areas, more focus should be given to the economic resilience of these areas, especially in countries and situations where the areas rely on self-generated revenues.
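The cost-revenue reasoning above can be illustrated with a toy simulation; the figures and function names below are invented for illustration and are not the thesis's actual model:

```python
def net_income(tourism_revenue, grant, fixed_costs, variable_costs):
    """Annual net income of a protected area."""
    return tourism_revenue + grant - fixed_costs - variable_costs

def simulate_shock(years, base_revenue, grant, fixed_costs, variable_costs,
                   shock_year, shock_factor):
    """Apply a sudden drop in tourism revenue (e.g. a pandemic) from shock_year on."""
    incomes = []
    for year in range(years):
        revenue = base_revenue * (shock_factor if year >= shock_year else 1.0)
        incomes.append(net_income(revenue, grant, fixed_costs, variable_costs))
    return incomes

# A park funded mainly by tourism, with large fixed costs...
tourism_park = simulate_shock(5, base_revenue=100.0, grant=0.0,
                              fixed_costs=60.0, variable_costs=20.0,
                              shock_year=2, shock_factor=0.5)
# ...versus one funded mainly by a government grant.
grant_park = simulate_shock(5, base_revenue=20.0, grant=80.0,
                            fixed_costs=60.0, variable_costs=20.0,
                            shock_year=2, shock_factor=0.5)
```

The tourism-dependent park's net income turns sharply negative after the shock, while the grant-funded park only loses part of its smaller self-generated share, mirroring the resilience argument above.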
  • Kurvinen, Pasi (University of Helsinki, 2003)
  • Piekkola, Elina (2013)
    The East Usambara Mountains, situated in north-eastern Tanzania, are globally recognized tropical forests with high biodiversity value. The Amani Nature Reserve encloses a high concentration of endemic species within its diverse biodiversity. The aim of this study is to assess the ecotourism possibilities and potential of the Amani Nature Reserve, to propose a sustainable option for regional development and livelihoods, and to outline the regional characteristics important for ecotourism. The data for this study was gathered during a field trip in Tanzania in January-March 2012, as part of an internship for WWF Finland's Coastal East Africa Initiative. The qualitative methods used included a structured questionnaire, semi-structured and in-depth interviews, field observation and literature analysis. In addition, several discussions between different regional stakeholders were carried out. Six villages in the East Usambara Mountains were studied. The study concludes that the Amani Nature Reserve has high potential for ecotourism development and that the area offers diverse nature-related activities, although current visitor numbers are low. The overall results indicate the high value of and possibilities offered by the regional biodiversity and the locals' positive attitudes towards tourism, but also the area's weaknesses: poor infrastructure and a lack of facilities and services. The locals' willingness to cooperate and participate in ecotourism activities, as well as existing cultural assets, were also recognized. The Amani Nature Reserve's location, uniqueness and existing facilities strongly support future ecotourism development. However, the locals' knowledge of tourism impacts and conservation issues should be reinforced, because there are currently multiple threats to these tropical forests, such as population growth and forest fragmentation. Ecotourism could reinforce forest conservation, local empowerment and sustainable livelihoods.
In order to safeguard the resource base of ecotourism, the environment, ecotourism actions need to follow ecotourism objectives and principles and consider the different spatial environmental, social and economic characteristics. According to these principles, the locals must be integrated into actions and decision-making processes at all levels, and careful ecotourism planning, management and monitoring must take place. Developing an ecotourism network in Tanzania is highly feasible because of the country's spectacular natural beauty and political stability. In order to safeguard the remaining life-supporting wildlife, different stakeholders and locals should also be engaged to work in cooperation, seeking sustainable means of conservation such as ecotourism.
  • Häggblom, Svante (2019)
    Background: User experience (UX) is seen as an important quality of a successful product, and software companies are becoming increasingly interested in the field of UX. As UX has the goal of improving the experience of users, there is a need for better methods of measuring the actual experience. One aspect of UX is to understand the emotional side of experience. Psychophysiology studies the relations between emotions and physiology, and electrodermal activity (EDA) has been found to be a physiological measure of emotional arousal. Aims: The aim of this thesis is to research the utility of measuring EDA to identify moments of emotional arousal during human-computer interaction. By studying peaks in EDA during software interaction, we expect to find issues in the software that work as triggers or stimuli for the peaks. Method: We used the design science methodology to develop EDAMUX. EDAMUX is a method to unobtrusively observe users while gathering significant interaction moments through self-reporting and EDA. A qualitative single-case study was conducted to evaluate the utility of EDAMUX. Results: We found that we can discover causes of bad user experience with EDAMUX. Moments of emotional arousal, derived from EDA, were found in conjunction with performance issues, usability issues and bugs. Emotional arousal was also observed during software interaction where the user was blaming themselves. Conclusions: EDAMUX shows potential in discovering issues in software that are difficult to find with methods that rely on subjective self-reporting. Having the potential to objectively study emotional reactions is seen as valuable in complementing existing methods of measuring user experience.
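The core idea, detecting EDA peaks and pairing them with logged interaction moments, can be sketched in miniature. This is not the EDAMUX implementation; the threshold rule, event-log format and all numbers below are invented for illustration (real tools use onset/amplitude criteria over time windows):

```python
def find_arousal_peaks(eda, threshold=0.05):
    """Indices where the EDA signal rises by more than `threshold` from the
    previous sample -- a crude stand-in for skin-conductance response detection."""
    return [i for i in range(1, len(eda)) if eda[i] - eda[i - 1] > threshold]

def align_with_events(peaks, events, window=2):
    """Pair each peak with logged interaction events occurring at most
    `window` samples before it (hypothetical event-log format)."""
    return {p: [label for t, label in events if 0 <= p - t <= window]
            for p in peaks}

eda = [0.40, 0.41, 0.40, 0.52, 0.60, 0.55, 0.54]   # toy signal, microsiemens
events = [(2, "page_load_slow"), (5, "click_ok")]  # (sample index, event)
peaks = find_arousal_peaks(eda)
causes = align_with_events(peaks, events)
```

Here the rise starting right after the slow page load is flagged, so the slow load becomes the candidate trigger for the arousal moment.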
  • Shestovskaya, Jamilya (2020)
    Nowadays the number of connected devices is growing sharply. Mobile phones and other IoT devices are inherent parts of everyday life and are used everywhere. The amount of data generated by IoT devices and mobile phones is enormous, which causes network congestion. In turn, the use of a centralized cloud architecture increases delay and causes jitter. To address these issues, the research community has discussed a new trend of decentralization: edge computing. There are different edge computing architectures suggested by various researchers. Some are more popular and supported by global companies. Most of these architectures have similarities. In this research, we reviewed seven edge computing architectures. This thesis is a comparative analysis, carried out using key attributes and a Venn diagram presentation, to support selecting the right edge computing architecture.
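An attribute-based comparison with a Venn-style presentation boils down to set operations over per-architecture feature sets; a minimal sketch, where the architecture labels and attribute names are invented and not taken from the seven architectures the thesis reviews:

```python
# Hypothetical attribute sets for three edge computing architectures.
architectures = {
    "A": {"local_processing", "orchestration", "multi_tenancy"},
    "B": {"local_processing", "orchestration", "open_source"},
    "C": {"local_processing", "open_source", "hardware_agnostic"},
}

# The intersection of all sets is the central region of the Venn diagram.
common = set.intersection(*architectures.values())

# Pairwise intersections correspond to the two-way overlap regions.
overlaps = {(a, b): architectures[a] & architectures[b]
            for a in architectures for b in architectures if a < b}
```

Attributes unique to one architecture (its set minus the union of the others) then indicate what is lost or gained by choosing it.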
  • Kovala, Jarkko (2020)
    Internet of Things (IoT) has the potential to transform many domains of human activity, enabled by the collection of data from the physical world at a massive scale. As the projected growth of IoT data exceeds that of available network capacity, transferring it to centralized cloud data centers is infeasible. Edge computing aims to solve this problem by processing data at the edge of the network, enabling applications with specialized requirements that cloud computing cannot meet. The current market of platforms that support building IoT applications is very fragmented, with offerings available from hundreds of companies with no common architecture. This threatens the realization of IoT's potential: with more interoperability, a new class of applications that combine the collected data and use it in new ways could emerge. In this thesis, promising IoT platforms for edge computing are surveyed. First, an understanding of current challenges in the field is gained through studying the available literature on the topic. Second, IoT edge platforms having the most potential to meet these challenges are chosen and reviewed for their capabilities. Finally, the platforms are compared against each other, with a focus on their potential to meet the challenges learned in the first part. The work shows that AWS IoT for the edge and Microsoft Azure IoT Edge have mature feature sets. However, these platforms are tied to their respective cloud platforms, limiting interoperability and the possibility of switching providers. On the other hand, open source EdgeX Foundry and KubeEdge have the potential for more standardization and interoperability in IoT but are limited in functionality for building practical IoT applications.
  • Sinikallio, Laura (2022)
    The digitization and structuring of parliamentary data for research use is an emerging field of research, with several national projects currently under way, for example in Europe. This thesis is part of the Semantic Parliament (Semanttinen parlamentti) project, in which the plenary speeches of the Parliament of Finland are brought together for the first time into a unified, harmonized, machine-readable dataset spanning the entire history of the Parliament, from its beginning in 1907 to the present day. The speeches and their rich descriptive metadata have been published in two versions: in the Parla-CLARIN XML format used for representing parliamentary data, and as a linked open data knowledge graph that connects the dataset to the wider national data infrastructure. The unified speech dataset offers unprecedented opportunities to examine Finnish parliamentarism over more than a hundred years in a multifaceted and automated way. The dataset contains nearly a million individual speeches and is closely linked to biographical information about the members of Parliament. This thesis describes the data models developed for representing the speeches and the process of collecting and transforming the speech data, and examines the challenges and opportunities of the process and the resulting dataset. To evaluate the usefulness of the published dataset, the Parla-CLARIN data has already been used in digital humanities research on political culture. Based on the linked data, a semantic portal, Parlamenttisampo, has been developed for publishing and studying the data on the web.
  • Katila, Nina (2020)
    This thesis examines, on the basis of scientific research and professional literature, means of and considerations for automating the integration testing of information systems. The research methodology is a case study. The case environment is the preconditions for, and the implementation alternatives of, automating the integration testing between the information systems supporting legislative work in the Parliament of Finland. Information was gathered for the thesis from the documentation of the Parliament's information systems and from experts on those systems. The Parliament's integration testing workflows and testing challenges are based on observations made while participating in the Parliament's integration testing for about a year. The analysis and evaluation of the automation alternatives are based on continuous cooperation with experts on the Parliament's legislative information systems and integration system. The Parliament's legislative information systems are functionally and administratively independent and differ in their implementers, implementations and age. Because the program code of these independent systems is not accessible across systems, an integration test automation solution must be based on what can be reached through the systems' user interfaces. The study found that robotic process automation (RPA) can be used to imitate the integration testing that the Parliament's testers perform through the systems' user interfaces. The study also found that with an automation framework suitable for test automation and RPA, it is possible to automate the integration testing of the Parliament's information systems. With RPA, integration tests can be executed significantly faster and with fewer resources than with manual testing. The most significant drawback of UI-based test automation is the cost of test maintenance.
With a modular keyword-driven automation framework, automated tests and their parts can be reused in integration test automation, thus saving costs.
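The modular keyword-driven idea described above can be sketched in miniature; in practice this role is played by tools such as Robot Framework, and the keyword, document and system names below are hypothetical, not the Parliament's actual systems:

```python
class KeywordLibrary:
    """Reusable UI-level actions. In a real RPA setup each keyword would
    drive an application through its user interface; here they only log."""
    def __init__(self):
        self.log = []

    def open_document(self, name):
        self.log.append(f"open {name}")

    def send_to_system(self, name, target):
        self.log.append(f"send {name} -> {target}")

    def verify_received(self, name, target):
        self.log.append(f"verify {name} in {target}")
        return True

def run_test(library, steps):
    """Execute a test case written as (keyword, args) pairs, so the same
    keywords can be reused across many integration tests."""
    return [getattr(library, keyword)(*args) for keyword, args in steps]

# A hypothetical integration test between two systems of a legislative workflow.
steps = [
    ("open_document", ("bill-123",)),
    ("send_to_system", ("bill-123", "archive")),
    ("verify_received", ("bill-123", "archive")),
]
lib = KeywordLibrary()
outcome = run_test(lib, steps)
```

Because test cases are just data (keyword sequences), maintenance after a UI change is localized to the keyword implementations rather than to every test.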
  • Vihko, Sami (2022)
    We will review techniques of perturbative thermal quantum chromodynamics (QCD) in the imaginary-time formalism (ITF). The infrared (IR) problems arising from the perturbative treatment of the equilibrium thermodynamics of QCD, and their phenomenological causes, will be investigated in detail. We will also discuss the construction of the two effective field theory (EFT) frameworks most often used in modern high-precision calculations to overcome these problems. The EFTs are the dimensionally reduced theories EQCD and MQCD, and hard thermal loop effective theory (HTL). EQCD is three-dimensional Euclidean Yang-Mills theory coupled to an adjoint scalar field, and MQCD is three-dimensional Euclidean pure Yang-Mills theory. The effective parameters in these theories are determined through matching calculations. HTL is based on the resummation of hard thermal loops and uses effective propagators and vertex functions. We will also discuss the perturbative determination of the pressure of QCD. Throughout, this thesis details the calculations and methodology.
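For orientation, the leading-order matching relations fixing the EQCD parameters can be quoted; these are standard results from the dimensional-reduction literature (for $N_c$ colors and $N_f$ massless quark flavors), not the higher-order expressions the thesis works with:

```latex
g_3^2 = g^2 T + \mathcal{O}(g^4 T), \qquad
m_{\mathrm{E}}^2 = \left( \frac{N_c}{3} + \frac{N_f}{6} \right) g^2 T^2 + \mathcal{O}(g^4 T^2)
```

where $g_3$ is the dimensionful gauge coupling of the three-dimensional theory and $m_{\mathrm{E}}$ is the mass of the adjoint scalar, i.e. the electric (Debye) screening mass.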
  • Räsänen, Hannele (2020)
    Nowadays, with the influence of the global economy, large corporations use global software development to utilise the advantages of geographically decentralised organisations and globally outsourced software development. Through distributed organisations the work can be done around the clock. Global software development is affected by three distance dimensions: time distance, geographical distance, and socio-cultural distance, all of which bring challenges. At the same time, the agile way of working has become an increasingly popular method in software development. As agile practices are created for co-located teams, there is a demand for working online solutions for communication and collaboration in distributed teams. Corporations use a scaled agile way of working to support the software development of large initiatives and projects. The Scaled Agile Framework (SAFe) is the most popular of the scaled agile methods. This thesis was conducted as a case study in a multinational corporation. The objective of the case study was to research the effectiveness of the scaled agile methodology SAFe on communication and collaboration in teams and agile release trains. The case study included two parts: a web survey and interviews. The results of the analyses of the case study support findings from the literature in the field. The results indicate the importance of communication and collaboration in agile practices and the significance of the online tools that support them.
  • Martinmäki, Petri (2013)
    The main purpose of this master's thesis is to present experiences of test automation in an industrial case and to make recommendations on best practices for the case. The recommendations are based on successful test automation stories found in the existing literature. The main issues hindering test automation seem to be similar in the case example and in the literature. The cost of implementation and maintenance, combined with unrealistic expectations, is perceived in almost every project. However, the most successful projects put a lot of effort into planning and implementing maintainable sets of automatic tests. In conclusion, the evidence from the literature shows that successful test automation needs investment, especially in the beginning of a project. A few specific best practices are adapted to the case project and presented in a form in which they could be applied.
  • Mäkelä, Ville (2022)
    With their ability to convert chemical energy to electrical energy through electrochemical reactions, rechargeable batteries are widely used to store energy in various applications such as mobile electronic equipment, aerospace aviation, road transportation, the power grid, and the national defense industry. Numerous battery types are available commercially. Lithium-ion-based batteries stand out due to several key advantages such as high operating voltage, high specific energy, and long cycle life. They also have market dominance in a wide range of electric vehicles. However, like all battery technologies, lithium-ion-based ones suffer from the effects of aging-induced degradation, which can lead to reduced capacity and lifetime, and in some cases even safety hazards. One method of preventing or slowing down these aging reactions is to modify the standard battery materials using dopants and additives: specific impurities purposely introduced into the battery during the manufacturing process. In this master's thesis, the effect of additives (Mg/Al) on the aging of Li-ion cells was examined using X-ray absorption spectroscopy, more specifically X-ray absorption near edge structure (XANES). For the experiment, 7 different cells, all containing lithium cobalt oxide as the major component (4 having a stoichiometric Li/Co ratio and 3 being Li-rich), 5 of them containing Mg/Al as dopants and 2 containing no dopants, were examined using XANES as a function of aging in terms of charge/discharge cycles. The dopants were introduced at different stages of the material preparation, either at the lithiation step or at the synthesis of the precursor. This thesis focuses on the XANES experiment and the data analysis, with an extensive literature review on the use of additives and dopants. The cells were prepared at Aalto University.
The results showed that of the cells with dopant materials, the cells doped during lithiation stage aged slightly better after cycling than the undoped ones, whereas the cells doped during precursor stage aged worse than the undoped cells. This would suggest that doping might be more effective when done during the lithiation stage.
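A common simple way to quantify an edge shift between XANES spectra is to interpolate the energy at half the normalized absorption maximum; the sketch below uses invented toy numbers, not the thesis's data or its actual analysis procedure:

```python
def edge_position(energies, absorption):
    """Estimate the absorption edge as the energy where the normalized signal
    first crosses half of its maximum, using linear interpolation."""
    half = max(absorption) / 2.0
    for i in range(1, len(absorption)):
        if absorption[i - 1] < half <= absorption[i]:
            # Linear interpolation between the two bracketing points.
            frac = (half - absorption[i - 1]) / (absorption[i] - absorption[i - 1])
            return energies[i - 1] + frac * (energies[i] - energies[i - 1])
    return None

# Toy normalized spectra on an energy grid (eV); a shift of the edge would
# indicate a change in the absorber's oxidation state (illustrative values).
e = [7700, 7705, 7710, 7715, 7720]
fresh = [0.0, 0.1, 0.6, 0.9, 1.0]
aged = [0.0, 0.05, 0.3, 0.8, 1.0]
shift = edge_position(e, aged) - edge_position(e, fresh)
```

Comparing such edge positions across charge/discharge cycles is one way the aging of doped and undoped cells can be tracked.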
  • Suorsa, Matti Valtteri (2017)
    In Finland, the spent nuclear fuel will be deposited at a depth of 400 m in the granitic bedrock. The disposal is based on the KBS-3 concept, which relies on the multi-barrier principle, where successive barriers prevent the migration of radionuclides to the biosphere. The spent nuclear fuel is placed in the disposal tunnels in copper-iron canisters, which are surrounded by bentonite clay to insulate them from groundwater flow and protect them from movements of the bedrock. Bentonite clay consists mainly of montmorillonite, which, like other aluminosilicates, is known to retain radionuclides, thus contributing to their retention or immobilization. Besides its contribution to the multi-barrier system, the bentonite buffer is assumed to be a potential source of colloids due to the erosion of bentonite under certain conditions. Colloids, in the context of radionuclide migration, are nanoparticles in the size range from 1 to 1000 nm that remain suspended in water. Montmorillonite colloids could potentially act as carriers for otherwise immobile radionuclides, such as the transuranium elements, in the case of canister failure. In particular, 241Am is an important radionuclide regarding the long-term safety of final disposal, as after a few hundred years 241Am and its mother nuclide 241Pu contribute most to the radiotoxicity of the spent nuclear fuel. The relevance of colloids to long-term performance depends on several factors such as colloid stability, mobility and their interaction with radionuclides. Colloid stability depends on groundwater conditions such as ionic strength and pH. In low-salinity groundwaters, montmorillonite colloids have been shown to be stable. On the other hand, the collective processes of the rock matrix, bentonite colloids and radionuclides have to be investigated to assess the long-term performance of the multi-barrier system.
This requires combining experiments at different scales, from simple laboratory experiments to large, natural-scale in-situ experiments, to understand the complex processes affecting colloid-facilitated radionuclide migration. Large-scale laboratory experiments conducted with granite blocks offer an intermediate between the two extremes, having a more natural system than the former and better controllability than the latter. In this study, radionuclide migration was studied in laboratory experiments at different scales. The colloid-facilitated transport of Eu was studied with a block-scale experiment using a granite block with a natural water-conducting fracture. The suitability of the block was assessed by conducting several experiments using different non-sorbing and sorbing tracers and montmorillonite colloids separated from synthetic Ni-labeled montmorillonite and Nanocor PGN montmorillonite (98 %). Laser-induced breakdown detection (LIBD), photon correlation spectroscopy (PCS) and ICP-/MP-OES were utilized in colloid detection. Supportive batch experiments were conducted to study the colloid stability in different groundwaters and the interaction between the granite, the different montmorillonite colloids and Eu, an analog of Am. Good reproducibility was obtained with non-sorbing tracers. The breakthrough curves of radioactive 3H and 36Cl and of the fluorescein and Amino-G dyes showed similar behavior. On the other hand, no breakthrough of montmorillonite colloids or 152Eu occurred. Based on the literature review, the low flow rates used could be the reason for this. A low flow rate (50 μl/min) could strongly affect colloid mobility, which could explain why Eu was retained in the fracture. More experiments with higher flow velocities would be required. The different montmorillonite materials showed similar, but not exactly the same, sorption behavior of Eu.
The fraction of Eu attached to colloids decreased during the experiments and, correspondingly, the fraction attached to the granite increased. At the same time, the colloids remained stable during the experiments. This indicates that desorption of Eu from the colloids takes place in the presence of granite. The effect of different water compositions on the stability of the colloids was also clearly seen in the preparation of colloid suspensions in the different water simulants. Even a small increase in the ionic strength of the solution made especially the Ni-montmorillonite colloids unstable.
  • Lehtinen, Simo (2021)
    The solar corona constantly emits a flow of charged particles, called the solar wind, into interplanetary space. This flow is diverted around the Earth by the magnetic pressure of the Earth’s own geomagnetic field, shielding the Earth from the effect of this particle radiation. On occasion the Sun ejects a large amount of plasma outwards from the corona in an event called a Coronal Mass Ejection (CME). Such events can drive discontinuities in the solar wind plasma, called interplanetary shocks. Shocks can affect the Earth’s magnetosphere, compressing it inwards and generating electromagnetic waves inside it. In this thesis we will cover a study of the ultra-low frequency (ULF) wave response in the magnetosphere to CME-driven shocks. Geomagnetic pulsations are ultra-low frequency plasma waves in the magnetosphere, observable from ground-based magnetometers. The compression of the magnetosphere by interplanetary shocks generates geomagnetic pulsations in the Pc4 and Pc5 frequency ranges (2 - 22 mHz). These waves play an important role in magnetospheric dynamics and the acceleration and depletion of high energy electrons in the radiation belts. We consider 39 interplanetary shock events driven by CMEs, and analyse ground-based magnetometer data from stations located near local noon at the time of the shock arrival. Solar wind measurements are used to categorise interplanetary shocks based on their Mach number and the dynamic pressure differential as main indicators of shock strength. The importance of these parameters in determining the strength of the wave response in the geomagnetic field is then studied using wavelet analysis and superposed epoch analysis. Stronger shocks are found to result in larger increases in wave activity, especially in the Pc4 range. 
Ground stations at higher latitudes observe higher wave power, but there is an interesting anomaly in the Pc4 range at stations magnetically connected to regions near the plasmapause, which show an enhanced wave power response. We quantify the decay time of the wave activity and find that it is around 20 hours for Pc5 waves and 7 hours for Pc4 waves.
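Superposed epoch analysis, as used above, amounts to aligning each event's time series on its key time (here, shock arrival) and averaging across events; a minimal sketch with toy data (the thesis's 39-event dataset is not reproduced):

```python
def superposed_epoch(series_list, epoch_indices, before, after):
    """Average several time series relative to their key times: the classic
    superposed epoch analysis. Samples outside a series are skipped."""
    n = before + after + 1
    sums = [0.0] * n
    counts = [0] * n
    for series, t0 in zip(series_list, epoch_indices):
        for k in range(-before, after + 1):
            i = t0 + k
            if 0 <= i < len(series):
                sums[k + before] += series[i]
                counts[k + before] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

# Two toy "wave power" records with the shock arriving at different indices.
event_a = [1, 1, 1, 5, 4, 3, 2]
event_b = [2, 2, 6, 5, 3, 2, 2]
mean = superposed_epoch([event_a, event_b], [3, 2], before=1, after=2)
```

The averaged curve shows the common response (a jump at epoch zero followed by decay) while event-to-event noise cancels.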
  • Pakkanen, Noora (2021)
    In Finland, the final disposal of spent nuclear fuel will start in the 2020s, with spent nuclear fuel disposed of 400-450 meters deep in the crystalline bedrock. Disposal will follow the Swedish KBS-3 principle, in which spent nuclear fuel canisters are protected by multiple barriers planned to prevent the migration of radionuclides to the surrounding biosphere. With multiple barriers, the failure of one barrier will not endanger the isolation of the spent nuclear fuel. The insoluble spent nuclear fuel will be stored in iron-copper canisters and placed in vertical tunnels within the bedrock. The iron-copper canisters are surrounded by a bentonite buffer to protect them from groundwater and from movements of the bedrock. MX-80 bentonite has been proposed as the bentonite buffer in the Finnish spent nuclear fuel repository. In the case of canister failure, the bentonite buffer is expected to absorb and retain radionuclides originating from the spent nuclear fuel. If the salinity of Olkiluoto island's groundwater were to decrease, chemical erosion of the bentonite buffer could result in the generation of small particles called colloids. Under suitable conditions, these colloids could act as potential carriers for immobile radionuclides and transport them outside the facility area to the surrounding biosphere. The objective of this thesis work was to study the effect of MX-80 bentonite colloids on radionuclide migration within two granitic drill core columns (VGN and KGG) using two different radionuclides, 134Cs and 85Sr. Batch-type sorption and desorption experiments were conducted to gain information on the sorption mechanisms of the two radionuclides as well as on the sorption competition between MX-80 bentonite colloids and crushed VGN rock. The colloids were characterized with scanning electron microscopy (SEM) and particle concentrations were determined with dynamic light scattering (DLS).
Allard water mixed with MX-80 bentonite powder was used to imitate low-salinity groundwater conditions and colloids. The breakthrough of strontium from the VGN drill core column was successful, whereas caesium did not break through from either the VGN or the KGG column. Caesium's sorption was more irreversible in nature than strontium's, and caesium was thus retained strongly within both columns. With both radionuclides, the presence of colloids did not seem to notably enhance radionuclide migration. Breakthrough from the columns was affected by both radionuclide properties and colloid filtration within tubes, stagnant pools and fractures. The experiments could be further complemented by conducting batch-type sorption experiments with crushed KGG rock and by introducing new factors to the column experiments. The experimental work was carried out at the Department of Chemistry, Radiochemistry, University of Helsinki.
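Batch-type sorption experiments like these are conventionally summarized by a distribution coefficient Kd; a minimal sketch of the standard formula, with invented numbers chosen so that caesium sorbs more strongly than strontium, as observed here:

```python
def distribution_coefficient(c0, ceq, volume_ml, mass_g):
    """Kd (ml/g) from a batch sorption experiment: activity removed from
    solution per gram of solid, divided by the equilibrium concentration."""
    return (c0 - ceq) / ceq * volume_ml / mass_g

def sorption_percent(c0, ceq):
    """Fraction of the initial activity taken up by the solid, in percent."""
    return 100.0 * (c0 - ceq) / c0

# Illustrative numbers only (not from the thesis): initial and equilibrium
# solution activities for 10 ml of solution contacting 1 g of crushed rock.
kd_cs = distribution_coefficient(c0=1000.0, ceq=50.0, volume_ml=10.0, mass_g=1.0)
kd_sr = distribution_coefficient(c0=1000.0, ceq=400.0, volume_ml=10.0, mass_g=1.0)
```

Comparing Kd values measured with and without bentonite colloids present is one way to quantify the sorption competition between colloids and the crushed rock.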
  • Jussila, Joonas (2019)
    In this thesis, the sputtering of tungsten surfaces under ion irradiation is studied using molecular dynamics simulations. The focus of this work is on the effect of surface orientation and incoming angle on tungsten sputtering yields. We present a simulation approach for simulating the sputtering yields of completely random surface orientations. This allows obtaining total sputtering yields averaged over a large number of arbitrary surface orientations, which are representative of the sputtering yield of a polycrystalline sample with random grain orientations in a statistically meaningful way. In addition, a completely different method was utilised to simulate the sputtering yields of tungsten fuzz surfaces with various fuzz structure heights. We observe that the total sputtering yields of the investigated surfaces are clearly dependent on the surface orientation, and the sputtering yields of average random surfaces differ from the results for any of the low-index surfaces or their averages. The low-index surface and random surface sputtering yields also show a dependence on the incoming angle of the projectile ions. In addition, we calculate the outgoing angular distribution of sputtered tungsten atoms in every bombardment case, which likewise proves sensitive to the surface orientation. Finally, the effect of fuzz height on the sputtering yield of tungsten fuzz surfaces is discussed. We see that tungsten fuzz significantly reduces the sputtering yield compared to a pristine tungsten surface, and the effect is already seen when the fuzz pillar height is a few atom layers.
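Averaging a yield over random surface orientations can be sketched as follows; the uniform sampling of surface normals is a standard technique, while the yield function here is a made-up stand-in for what would be a full MD simulation per orientation in the thesis:

```python
import math
import random

def random_orientation(rng):
    """A surface normal drawn uniformly from the unit sphere
    (sample z uniformly, then a uniform azimuthal angle)."""
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(phi), r * math.sin(phi), z)

def mean_yield(yield_fn, n_orientations, seed=0):
    """Average a per-orientation sputtering yield over many random surface
    orientations, approximating a polycrystal with random grains."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_orientations):
        total += yield_fn(random_orientation(rng))
    return total / n_orientations

# Hypothetical orientation-dependent yield (atoms/ion), for illustration only.
toy_yield = lambda n: 1.0 + abs(n[2])
avg = mean_yield(toy_yield, 1000)
```

With enough samples the average converges to the polycrystalline expectation, which for this toy yield is 1.5 (since the mean of |z| over the sphere is 0.5) and need not coincide with the yield of any single low-index surface.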
  • Mäntysaari, Matti (2021)
    This thesis describes the data and data analysis of Compton scattering experiments to obtain the Compton profiles of metallic sodium (Na) as a function of temperature. The temperatures used in the experiment were 6 K and 300 K. The purpose of the work was to visualize the effect of temperature on the electron momentum density in a free-electron gas. The effects of temperature were expected to be manifested through changes to the Fermi momentum according to free-electron theory, but more subtle changes could also have been possible owing to possible deviations from free-electron theory. The measurements were done at the European Synchrotron Radiation Facility (Grenoble, France) beamline ID20. The data was analyzed with the help of a program written in Matlab, which converted the measured Compton spectra from photon energy space to electron momentum space, while applying self-absorption corrections to the data, subtracting the background, and normalizing the data using trapezoidal numerical integration to yield the final Compton profiles. The results were obtained as valence Compton profiles and their differences between 300 K and 6 K, and compared with the prediction of free-electron gas theory. The Compton profiles followed the predictions of free-electron gas theory well, although the theoretical profiles had a higher amplitude than the measured ones. This is a commonly found phenomenon in Compton spectroscopy and is assigned to originate from electron-electron correlations. The effect of temperature on the Compton profiles is in good agreement with free-electron theory.
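The normalization step mentioned above, scaling a Compton profile so that its trapezoidal integral equals the number of electrons it represents, can be sketched with toy numbers (the grid and profile values below are illustrative, not the measured Na data):

```python
def trapezoid(y, x):
    """Trapezoidal integral of samples y over grid x."""
    return sum((y[i] + y[i + 1]) * (x[i + 1] - x[i]) / 2.0
               for i in range(len(y) - 1))

def normalize_profile(j, q, electrons):
    """Scale a Compton profile J(q) so its integral over the momentum grid
    equals the number of electrons per atom it should represent."""
    area = trapezoid(j, q)
    return [v * electrons / area for v in j]

# Toy valence profile on a symmetric momentum grid (atomic units).
q = [-2.0, -1.0, 0.0, 1.0, 2.0]
j = [0.1, 0.5, 0.8, 0.5, 0.1]
j_norm = normalize_profile(j, q, electrons=1.0)  # one 3s valence electron in Na
```

After this common normalization, profiles measured at 6 K and 300 K can be subtracted from each other and compared with the free-electron prediction on an equal footing.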
  • Oyekoya, Gboyega Nathaniel (2018)
    Methods for mitigating climate change have been researched for decades. It has been established that emissions of greenhouse gases (GHG) such as carbon dioxide (CO2), nitrous oxide (N2O), and methane (CH4) are largely due to anthropogenic activities. A significant amount of these gases, particularly N2O and CH4, is emitted due to agricultural practices. To boost food supply, nitrogenous fertilizers are the most common fertilizers used in farming, and as the global population is projected to reach 9 billion by 2050, it is crucial to keep pace with both population growth and potential economic development. Applying biochar to soil has been regarded as a potential approach to combating climate change while boosting food supplies. In this study, the effect of two different biochars mixed with 15N-labelled ammonium nitrate fertilizers (either nitrate-labelled NH415NO3 or ammonium-labelled 15NH4NO3) on the growth of Meroa tetraploid Italian ryegrass (Lolium multiflorum) was investigated in the Viikki Campus greenhouse, University of Helsinki, Finland. We monitored the impact of the biochars on the aboveground biomass (g pot-1), root biomass (g pot-1), and total biomass (g pot-1). Furthermore, the retention of nutrient ions in the leachates was investigated, namely the atom percentage (at%) of 15N-labelled ammonium (at% 15NH4-N), mg NH4-N, the at% of 15N-labelled nitrate (at% 15NO3-N), and mg NO3-N. Lastly, GHG fluxes of N2O-N (ug kg-1 soil hr-1), CH4-C (ug kg-1 soil hr-1), and CO2-C (mg kg-1 soil hr-1) were also monitored for a period of one month in comparison with control and fertilized control samples. Leachates were collected once a week, and gas samples were taken twice a day on two days each week. The sampled gases and collected leachates were analysed, and the plants were harvested to obtain the aboveground biomass (leaves), root biomass (roots), and total biomass (leaves + roots).
Our results support a more eco-friendly model in which biochars allow the amount of nitrogenous fertilizer applied to be reduced while increasing agricultural produce. It was concluded that the effects of the different biochars on the attributes mentioned above are distinguishable.
  • Mozejko, Arik (2023)
    Dark matter (DM) is introduced and explored from a holistic perspective. Topics include observational evidence, various DM properties, potential candidates, and the tenets of indirect versus direct DM detection. An emphasis is then placed on understanding the cryogenic detection of weakly interacting massive particles, with explicit connection to phonon-based detection of DM. The importance of improving methods of direct DM detection is emphasised, specifically the use of molecular dynamics simulations as an avenue for studying defect creation in cryogenic detector materials. Previous investigations into this area are reviewed and expanded upon through novel simulations of how defect properties vary with the thermal motion of the crystal lattice. These simulations are conducted using molecular dynamics on sapphire (Al2O3) as a DM direct-detection material, and it is found that while atomic velocity does not affect the overall emergent defect structure, it does affect the energy lost in these defects. Changing the temperature of the lattice produces the expected results, generating greater variance in both the defect band structure and the average energy loss.
  • Lübbers, Henning (2012)
    General-purpose lossless data compression continues to be an important aspect of the daily use of computers, and a multitude of methods and corresponding compression programs has emerged since information theory was established as a distinct area of research by C.E. Shannon shortly after World War II. Shannon and others discovered several theoretical bounds of data compression; it is, for instance, now known that there can be neither a compressor that compresses all possible inputs, nor any mechanism that compresses at least some inputs while preserving the lengths of the incompressible ones. Although any compressor must therefore necessarily expand some inputs, it is nonetheless possible to limit the expansion of any input to a constant number of bits in the worst case. In order to determine how the established theoretical bounds relate to existing compression programs, I examined two popular compressors, GZip and BZip2, and concluded that their behaviour is not optimal in all respects, as in the worst case they may expand inputs more than theoretically necessary. On the other hand, the examined programs provide very good compression for most realistic inputs rather swiftly, a characteristic that is most likely appreciated by most computer users. Motivated by a review of the most fundamental bounds of data compression, i.e. Kolmogorov complexity, entropy, and the Minimum Description Length principle, and further encouraged by the analysis of GZip and BZip2, I propose in this thesis a generic, pipelined architecture that can (at least in theory) be employed to achieve optimal compression in two passes over the input to be compressed. I subsequently put the proposed architecture directly to the test, and use it to substantiate my claim that the performance of compression (boosting) methods can be improved if they are configured anew for each input with a dynamically discovered set of optimal parameters.
In a simple empirical study, I use Huffman coding, a classic entropy-based compression method, as well as Move-To-Front coding (MTF), a compression-boosting method designed to exploit locality among source symbols, to demonstrate that the choice of the implied source alphabet influences the achieved compression ratio and that different test inputs require different source alphabets to achieve optimal compression.
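    The claim that the implied source alphabet matters can be seen already in a minimal Move-To-Front coder. The sketch below (illustrative only, not the implementation used in the study) shows that the same input yields different, and differently compressible, index streams depending on the initial alphabet order:

```python
def mtf_encode(data, alphabet):
    """Move-To-Front: emit each symbol's current table index, then move it
    to the front. Recently seen symbols get small indices, which a
    subsequent entropy coder such as Huffman coding can exploit."""
    table = list(alphabet)
    out = []
    for sym in data:
        i = table.index(sym)
        out.append(i)
        table.insert(0, table.pop(i))
    return out

def mtf_decode(codes, alphabet):
    """Inverse transform: look up each index and apply the same table update."""
    table = list(alphabet)
    out = []
    for i in codes:
        sym = table.pop(i)
        out.append(sym)
        table.insert(0, sym)
    return out

# The initial alphabet order changes the index stream for the same input:
print(mtf_encode("aaabbb", "ab"))  # → [0, 0, 0, 1, 0, 0]
print(mtf_encode("aaabbb", "ba"))  # → [1, 0, 0, 1, 0, 0]
```

    The first ordering produces a stream with smaller indices, which an entropy coder encodes in fewer bits; discovering such a per-input ordering is exactly the kind of dynamically configured parameter the proposed architecture is meant to find.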