Browsing by discipline "none"
Now showing items 21-40 of 270
-
(2020)Automatic readability assessment is considered a challenging task in NLP due to its high degree of subjectivity. The majority of prior work on assessing readability has focused on identifying the level of education necessary for comprehension, without considering text quality, i.e., how naturally the text flows from the perspective of a native speaker. In this thesis, we therefore aim to use language models, trained on well-written prose, to measure not only text readability in terms of comprehension but also text quality. We developed two word-level metrics based on the concordance of article text with predictions made by language models to assess text readability and quality. We evaluate both metrics on a set of corpora used for readability assessment or automated essay scoring (AES) by measuring the correlation between scores assigned by our metrics and by human raters. According to the experimental results, our metrics are strongly correlated with text quality, achieving correlations of 0.4-0.6 on 7 out of 9 datasets. We demonstrate that GPT-2 surpasses other language models, including a bigram model, an LSTM, and a bidirectional LSTM, on the task of estimating text quality in a zero-shot setting, and that a GPT-2 perplexity-based measure is a reasonable indicator of text quality.
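The perplexity idea behind the GPT-2 measure can be illustrated without any pretrained model. The sketch below scores text with a toy add-one-smoothed bigram model trained on a few words of reference prose; the corpus, tokenization, and smoothing are invented for illustration and merely stand in for GPT-2 and its subword vocabulary:

```python
import math
from collections import Counter

def train_bigram(corpus_tokens):
    # Counts needed for add-one (Laplace) smoothed bigram probabilities.
    unigrams = Counter(corpus_tokens)
    bigrams = Counter(zip(corpus_tokens, corpus_tokens[1:]))
    vocab = len(set(corpus_tokens))
    return unigrams, bigrams, vocab

def perplexity(tokens, model):
    # exp of the average negative log-probability per bigram.
    unigrams, bigrams, vocab = model
    log_prob = 0.0
    for prev, cur in zip(tokens, tokens[1:]):
        p = (bigrams[(prev, cur)] + 1) / (unigrams[prev] + vocab)
        log_prob += math.log(p)
    n = max(len(tokens) - 1, 1)
    return math.exp(-log_prob / n)

reference = "the cat sat on the mat and the dog sat on the rug".split()
model = train_bigram(reference)

fluent = "the cat sat on the rug".split()
odd = "rug the on sat dog mat".split()
print(perplexity(fluent, model) < perplexity(odd, model))  # → True
```

Text that flows like the reference prose gets lower perplexity, which is the intuition behind using a language model's perplexity as a quality signal.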
-
(2020)Global warming is expected to have detrimental consequences for fragile ecosystems in the tropics and to threaten both global biodiversity and the food security of millions of people. Forests have the potential to buffer temperature changes, and the microclimatic conditions below tree canopies usually differ substantially from the ambient macroclimate. Trees cool down their surroundings through several biophysical mechanisms, and the cooling benefits also occur with trees outside forests. Remote sensing technologies offer new possibilities to study how tree cover affects temperatures at both local and regional scales. The aim of this study was to examine the effect of canopy cover on microclimate and land surface temperature (LST) in Taita Hills, Kenya. Temperatures recorded by 19 microclimate sensors under different canopy covers in the study area and LST estimated by the Landsat 8 thermal infrared sensor (TIRS) were studied. The main interest was in daytime mean and maximum temperatures measured with the microclimate sensors in June-July 2019. The Landsat 8 imagery was obtained on July 4, 2019, and LST was retrieved using the single-channel method. The temperature records were combined with high-resolution airborne laser scanning (ALS) data of the area from 2014 and 2015 to address how topographical factors and canopy cover affect temperatures in the area. Four multiple regression models were developed to study the joint impacts of topography and canopy cover on LST. The results showed a negative linear relationship between daytime mean and maximum temperatures and canopy cover percentage (R2 = 0.6–0.74). Any increase in canopy cover contributed to reducing temperatures at all microclimate measuring heights, the magnitude being highest at soil surface level. The difference in mean temperatures between 0% and 100% canopy cover sites was 4.6–5.9 ˚C, and in maximum temperatures 8.9–12.1 ˚C.
LST was also negatively affected by canopy cover, with a slope of 5.0 ˚C. It was found that the impact of canopy cover on LST depends on altitude and that a considerable dividing line existed at 1000 m a.s.l., as the effect of canopy cover in the highlands decreased to half of that in the lowlands. Based on the results it was concluded that trees have a substantial effect on both microclimate and LST, but the effect is highly dependent on altitude. This indicates the increasing significance of trees in hot environments and highlights the importance of maintaining tree cover particularly in the lowland areas. Trees outside forests can increase climate change resilience in the area, and the remaining forest fragments should be conserved to moderate regional temperatures.
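A multiple regression of temperature on canopy cover and a topographic variable, as used in the study, can be sketched with ordinary least squares via the normal equations. The data values below are synthetic and noise-free, purely to show the mechanics:

```python
def solve(a, b):
    # Solve a linear system by Gaussian elimination with partial pivoting.
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n + 1):
                m[r][k] -= f * m[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][k] * x[k] for k in range(r + 1, n))) / m[r][r]
    return x

def ols(rows, y):
    # Normal equations: (X^T X) beta = X^T y, with an intercept column.
    X = [[1.0] + list(r) for r in rows]
    k = len(X[0])
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    return solve(xtx, xty)

# Hypothetical observations: (canopy cover %, elevation m) -> daytime temperature.
data = [(0, 900), (20, 950), (40, 1000), (60, 1100), (80, 1050), (100, 1200)]
temps = [30.0 - 0.05 * c - 0.006 * e for c, e in data]  # synthetic, noise-free
beta = ols(data, temps)
print([round(b, 3) for b in beta])  # → [30.0, -0.05, -0.006]
```

With noise-free data the fit recovers the generating coefficients exactly: a negative cover slope and a negative elevation slope, mirroring the negative cover-temperature relationship reported above.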
-
(2020)The aim of this thesis is to explore applications of machine learning to the study of asteroid spectra; its research question can be summarized as: how can asteroid spectra be analyzed using machine learning? The question is explored by evaluating the solutions obtained for two tasks: finding the optimal locations of spectrophotometric filters for asteroid classification, and forming an asteroid taxonomy through unsupervised clustering. First, background theory on asteroids, and particularly on asteroid spectroscopy, is presented. Next, the theory of machine learning is briefly discussed, with a focus on the method used to solve the first task: neural networks. The first task is addressed by developing an optimization algorithm with access to a neural network that determines the classification success rate for data samples that would be obtained using spectrophotometric filters at specific locations within the possible wavelength range. The second task is approached by determining the optimal number of clusters for the given dataset and then developing taxonomies with the k-means clustering algorithm. The results obtained for the first task, the optimal filter locations for spectrophotometry, seem reliable and correlate relatively well with well-known mineralogical features on asteroid surfaces. The taxonomic systems developed by unsupervised clustering also succeeded rather well, as many of the formed clusters seem meaningful and follow the trends of other asteroid taxonomies. Based on the two investigated tasks, it therefore seems that machine learning can be applied well to asteroid spectroscopy. For future studies, larger datasets would be required to improve the overall reliability of the results.
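The clustering step can be sketched with a bare-bones k-means loop. The two-feature "spectra" and the deterministic, evenly spaced seeding are invented for illustration (real implementations use random restarts or k-means++ seeding, and real spectra have many more dimensions):

```python
def kmeans(points, k, iters=50):
    # Naive k-means: deterministic evenly spaced initial centroids,
    # then alternate assignment and centroid update until convergence.
    centroids = [points[i * len(points) // k] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: dist2(p, centroids[c]))
            clusters[nearest].append(p)
        new = [mean(c) if c else centroids[i] for i, c in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    return centroids, clusters

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    return tuple(sum(p[d] for p in pts) / len(pts) for d in range(len(pts[0])))

# Toy "spectra" reduced to two features, e.g. slopes of two wavelength bands.
flat = [(0.00, 0.10), (0.10, 0.00), (0.00, 0.00), (0.10, 0.10)]
red = [(1.00, 1.10), (1.10, 1.00), (1.00, 1.00), (1.10, 1.10)]
cents, clusters = kmeans(flat + red, k=2)
print(sorted(len(c) for c in clusters))  # → [4, 4]
```

Choosing k, as the thesis does, typically means repeating this for several k and scoring each partition (e.g. by within-cluster sum of squares or silhouette).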
-
(2020)In this thesis we look at the asymptotic approach to modeling randomly weighted heavy-tailed random variables and their sums. Heavy-tailed distributions, named after the defining property of having more probability mass in the tail than any exponential distribution, are essentially a way to represent large tail risk in a model in a realistic manner. Weighted sums of random variables are a versatile basic structure that can be adapted to model anything from claims over time to the returns of a portfolio, while giving the primary random variables heavy tails is a natural way to integrate extremal events into the models. The methodology introduced in this thesis offers an alternative to some of the prevailing and traditional approaches in risk modeling. Our main result, which we cover in detail, originates from "Randomly weighted sums of subexponential random variables" by Tang and Yuan (2014). It draws an asymptotic connection between the tails of randomly weighted heavy-tailed random variables and the tails of their sums, explicitly stating how the various tail probabilities relate to each other; in effect, it extends the idea that for sums of heavy-tailed random variables, large total claims originate from a single source instead of accumulating from many smaller claims. A great merit of these results is that the random weights are, for the most part, allowed to lack an upper bound and to be arbitrarily dependent on each other. As for applications, we first look at an explicit estimation method for computing extreme quantiles of a loss distribution, yielding values for the common risk measure known as Value-at-Risk. The methodology can easily be adapted to settings with similar preexisting knowledge, demonstrating a straightforward way of applying the results.
We then move on to examine the ruin problem of an insurance company, developing a setting and conditions that can be imposed on the structures to permit an application of our main results, yielding an asymptotic estimate for the ruin probability. Additionally, to be more realistic, we introduce the approach of crude asymptotics, which requires a little less to be known about the primary random variables; we formulate a result similar in fashion to our main result and proceed to prove it.
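The Value-at-Risk application can be illustrated with a small Monte Carlo experiment: for a heavy-tailed loss distribution (here an unweighted Pareto, so the closed-form quantile is known), the empirical quantile of simulated losses approximates the analytic one. The tail index and sample size are illustrative only; the thesis's estimator for randomly weighted sums is more refined:

```python
import random

def empirical_var(samples, p):
    # Empirical Value-at-Risk: the p-quantile of simulated losses.
    s = sorted(samples)
    return s[int(p * len(s))]

rng = random.Random(42)
alpha = 2.5          # Pareto tail index; smaller alpha means a heavier tail
n = 200_000
losses = [rng.paretovariate(alpha) for _ in range(n)]

p = 0.99
analytic = (1 - p) ** (-1 / alpha)   # exact p-quantile of Pareto(alpha) on [1, inf)
estimate = empirical_var(losses, p)
print(abs(estimate / analytic - 1) < 0.05)  # → True: close to the closed form
```

The same simulation also shows the "single big jump" intuition: extreme values of a heavy-tailed sum are typically driven by one dominant term rather than by many moderate ones.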
-
(2020)Resonant inelastic X-ray scattering (RIXS) is one of the most powerful synchrotron-based methods for attaining information on the electronic structure of materials. Novel ultra-brilliant X-ray sources, X-ray free electron lasers (XFELs), offer intriguing possibilities beyond traditional synchrotron-based techniques, facilitating the transition of X-ray spectroscopic methods into the nonlinear intensity regime. Such nonlinear phenomena are well known in the optical energy range, less so at X-ray energies. The transition of RIXS to the nonlinear regime could have a significant impact on X-ray based materials research by enabling more accurate measurements of previously observed transitions, allowing the detection of weakly coupled transitions in dilute samples, and possibly uncovering completely unforeseen information or serving as a platform for novel, intricate methods of the future. Nonlinear RIXS, or stimulated RIXS (SRIXS), has already been demonstrated at an XFEL in the simplest possible proof-of-concept case. In this work, a comprehensive introduction to SRIXS is presented from a theoretical point of view, starting from the very beginning, thus making it suitable for anyone with a basic understanding of quantum mechanics and spectroscopy. To start off, the principles of many-body quantum mechanics are revised and the configuration interaction method for representing molecular states is introduced. No previous familiarity with X-ray-matter interaction or RIXS is required, as the molecular and interaction Hamiltonians are carefully derived, and based on them a thorough analysis of traditional RIXS theory is presented. In order to stay in touch with the real world, the basic experimental facts are recapped before moving on to SRIXS.
First, an intuitive picture of the nonlinear process is presented, shedding some light on the term 'stimulated' while introducing basic terminology and some X-ray pulse schemes along with futuristic theoretical examples of SRIXS experiments. After this, a careful derivation of the Maxwell-Liouville-von Neumann theory up to quadrupole order is presented for the first time. Finally, the chapter is concluded with a short analysis of the experimental status quo at XFELs and some speculation on possible transition metal samples where SRIXS, in its current state, could be applied to observe quadrupole transitions, which would advance the field remarkably.
-
(2020)This thesis presents the Atmospherically Relevant Chemistry and Aerosol Box Model (ARCA box), which is used for simulating atmospheric chemistry, the time evolution of aerosol particles, and the formation of stable molecular clusters. The model can be used, for example, to solve the concentrations of atmospheric trace gases formed from predefined precursors, to simulate and design smog chamber experiments, or to estimate indoor air quality. The backbone of ARCA's chemical library comes from the Master Chemical Mechanism (MCM), extended with the Peroxy Radical Autoxidation Mechanism (PRAM), and is further extendable with any new reactions. Molecular clustering is simulated with the Atmospheric Cluster Dynamics Code (ACDC). The particle size distribution is represented with two alternative methods whose size range and grid density are fully configurable. The evolution of the particle size distribution due to the condensation of low-volatility organic vapours and Brownian coagulation is simulated using established kinetic and thermodynamic theories. The user interface of ARCA differs considerably from previous comparable models: the model has a graphical user interface, which improves its usability and the repeatability of simulations. The user interface also increases the potential for ARCA to be used outside the modelling community, for example in experimental atmospheric sciences or by authorities.
-
(2019)Ion beams have been the subject of significant industry interest since the 1950s. They have gained usage in many fields for their ability to modify material properties in a controlled manner. Most important has been the application to semiconductor devices such as diodes and transistors, where the necessary doping is commonly achieved by irradiation with appropriate ions, allowing the development of the technology that we see in everyday use. With the ongoing transition to ever smaller semiconductor devices, the precision required of the manufacturing process correspondingly increases. A strong suite of modeling tools is therefore needed to advance the understanding and application of ion beam methods. The binary collision approximation (BCA) as a simulation tool was first introduced in the 1950s. It allows the prediction of many radiation-related phenomena for single collision cascades, and has been adopted in many experimental laboratories and industries due to its efficiency. However, it fails to describe chemical and thermodynamic effects, limiting its usefulness where ballistic effects are not a sufficient description. Parallel to BCA, the molecular dynamics (MD) simulation algorithm was developed. It allows a more accurate and precise description of interatomic forces and therefore chemical effects. It is, however, orders of magnitude slower than the BCA method. In this work, a new variant of the MD algorithm is developed to combine the advantages of both the MD and the BCA methods. The activation and deactivation of atoms involved in atomic cascades is introduced as a way to save computational effort, concentrating the performed computations in the region of interest around the cascade and ignoring surrounding equilibrium regions. 
By combining this algorithm with a speedup scheme limiting the number of necessary relaxation simulations, a speedup of one order of magnitude is reached for covalent materials such as Si and Ge, for which the algorithm was validated. The developed algorithm is used to explain the behavior of Ge nanowires under Xe ion irradiation. The nanowires were shown experimentally to bend towards or away from the ion beam, and computational simulations might help with the understanding of the underlying physical processes. In this thesis, the high-fluence irradiation of a Ge nanowire is simulated and the resulting defect structure analyzed to study the bending, doubling as a second test of the developed algorithm.
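The activation idea described above can be caricatured in a few lines: atoms are marked active when their kinetic energy exceeds a threshold or when they sit within a cutoff of an energetic atom, and only active atoms would then be advanced by the integrator. The threshold, cutoff, and data layout here are invented for illustration; the actual algorithm developed in the thesis is considerably more involved:

```python
def active_set(positions, kinetic, e_thresh, cutoff):
    # Atoms above the kinetic-energy threshold seed the active region;
    # atoms within the cutoff of a seed are activated too. Everything
    # else stays frozen, saving the force evaluations for those atoms.
    seeds = {i for i, e in enumerate(kinetic) if e > e_thresh}
    active = set(seeds)
    c2 = cutoff ** 2
    for i, p in enumerate(positions):
        if i in active:
            continue
        if any(dist2(p, positions[s]) <= c2 for s in seeds):
            active.add(i)
    return active

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Toy 1D chain: only atom 2 is energetic; its neighbours within the cutoff join it.
pos = [(0.0,), (1.0,), (2.0,), (3.0,), (9.0,)]
kin = [0.01, 0.01, 5.0, 0.01, 0.01]
print(sorted(active_set(pos, kin, e_thresh=1.0, cutoff=1.5)))  # → [1, 2, 3]
```

In a cascade simulation this selection would be refreshed every few steps, so the active region grows with the cascade and shrinks again as it quenches.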
-
(2019)Background: Electroencephalography (EEG) depicts electrical activity in the brain and can be used in clinical practice to monitor brain function. In neonatal care, physicians can use continuous bedside EEG monitoring to determine the cerebral recovery of newborns who have suffered birth asphyxia, which creates a need for frequent, accurate interpretation of the signals over the monitoring period. An automated grading system can aid physicians in the Neonatal Intensive Care Unit by automatically distinguishing between different grades of abnormality in the neonatal EEG background activity patterns. Methods: This thesis describes using a support vector machine as the base classifier to classify seven grades of EEG background pattern abnormality in data provided by the BAby Brain Activity (BABA) Center in Helsinki. We are particularly interested in reconciling the manual grading of EEG signals by independent graders, and we analyze the inter-rater variability of EEG graders by comparing a classifier built on selected epochs graded in consensus with a classifier built on full-duration recordings. Results: The inter-rater agreement score between the two graders was κ=0.45, indicating moderate agreement between the EEG grades. The most common grade of EEG abnormality was grade 0 (continuous), which made up 63% of the epochs graded in consensus. We first trained two baseline reference models using the full-duration recordings and the labels of the two graders, which achieved 71% and 57% accuracy. We achieved 82% overall accuracy in classifying selected patterns graded in consensus into seven grades using a multi-class classifier, though this model did not outperform the two baseline models when evaluated with the respective graders' labels. In addition, we achieved 67% accuracy in classifying all patterns from the full-duration recordings using a multilabel classifier.
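The agreement statistic quoted above (κ=0.45) is Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal implementation, with made-up grades for two raters over ten epochs:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    # kappa = (p_o - p_e) / (1 - p_e): observed agreement p_o corrected for
    # the chance agreement p_e implied by each rater's label frequencies.
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical abnormality grades (0-6) assigned by two raters.
a = [0, 0, 1, 2, 0, 3, 0, 1, 0, 2]
b = [0, 1, 1, 2, 0, 2, 0, 1, 1, 2]
print(round(cohens_kappa(a, b), 2))  # → 0.58
```

Values around 0.4-0.6 are conventionally read as moderate agreement, which is why the thesis turns to consensus-graded epochs for its cleanest training labels.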
-
(2020)Real estate appraisal, or property valuation, requires strong expertise to be performed successfully, making it a costly process. However, with structured data on historical transactions, machine learning (ML) enables automated, data-driven valuation which is instant, virtually costless and potentially more objective than traditional methods. Yet fully ML-based appraisal is not widely used in real business applications, as existing solutions are not sufficiently accurate and reliable. In this study, we introduce an interpretable ML model for real estate appraisal using hierarchical linear modelling (HLM). The model is trained and tested on an empirical dataset of apartment transactions in the Helsinki area, collected during the past decade. As a result, we introduce a model with competitive predictive performance that is simultaneously explainable and reliable. The main outcome of this study is the observation that hierarchical linear modelling is a highly promising approach for automated real estate appraisal. The key advantage of HLM over alternative learning algorithms is its balance of performance and simplicity: the algorithm is complex enough to avoid underfitting but simple enough to be interpretable and easy to productize. In particular, the ability of these models to output complete probability distributions quantifying the uncertainty of the estimates makes them suitable for business use cases where high reliability is required.
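A hallmark of hierarchical linear models is partial pooling: group-level estimates (say, per-district price levels) are shrunk toward the global mean in proportion to how little data each group has. A minimal sketch under invented variance assumptions and made-up prices, not the model or data of the study:

```python
def partial_pool(groups, tau2=1.0, sigma2=4.0):
    # Shrink each group's mean toward the grand mean.
    # tau2: assumed between-group variance; sigma2: within-group variance.
    all_values = [v for vs in groups.values() for v in vs]
    grand = sum(all_values) / len(all_values)
    pooled = {}
    for name, vs in groups.items():
        n = len(vs)
        mean = sum(vs) / n
        w = tau2 / (tau2 + sigma2 / n)  # more data -> weight closer to 1
        pooled[name] = w * mean + (1 - w) * grand
    return pooled

# Hypothetical price/m2 (k-euros) by district; district "C" has only one sale.
prices = {"A": [5.0, 5.4, 5.2, 4.9], "B": [3.8, 4.1, 4.0], "C": [7.5]}
est = partial_pool(prices)
print(est["C"] < 7.5)  # → True: the lone observation is pulled toward the grand mean
```

This shrinkage is what keeps predictions for thinly traded districts stable, and the same machinery yields the predictive distributions the abstract highlights.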
-
(2020)School mathematics has taken great leaps forward in recent years. The national core curriculum for basic education, approved in 2014, brought programming into mathematics teaching. The curriculum strongly emphasizes the use of information and communication technology in teaching, both in general and in mathematics. Technology offers ample opportunities for classroom work: spatial geometry can be created, observed, and explored with applications designed for the purpose, and ready-made, usable applets are readily available, for example in GeoGebra. One might therefore expect that textbooks give pupils plenty of support for using technology in their studies. The learning materials used in schools changed with the latest curriculum; for example, materials may now be in digital form instead of a traditional book. Traditional textbooks contain, in the style of earlier books, a wide range of exercises. In spatial geometry, the exercises emphasize visualization and computation, though there are also some exercises on three-dimensional visualization. Information and communication technology is not mentioned or brought up in the pupils' books. Since spring 2019, the mathematics matriculation examination has been taken as a digital exam. More than half of pupils continue from comprehensive school to upper secondary school, so after lower secondary school pupils have a few years in upper secondary school to master the various digital tools. Upper secondary students would benefit enormously if, for example, geometric drawing tools were introduced already in basic education. The transition between basic education and upper secondary school is in any case a great leap. Basic education mathematics learning materials do not make use of technology in spatial geometry. Further research on integrating technology into the teaching of mathematics and spatial geometry would be desirable.
-
(2019)This study provides an overview of Babylonian mathematics, examines its achievements and special characteristics, and considers its greatest merits. In addition, it investigates how Babylonian mathematics influenced the development of mathematics and how Babylonian inventions found their way, in particular, into the hands of Greek mathematicians. Alongside Babylonian mathematics, Babylonian astronomy is also examined where relevant. The study also explores whether Babylonian mathematics has connections to the present day, in particular to the division of the hour into 60 minutes and the minute into 60 seconds, and of the full circle into 360 degrees. The study was carried out as a literature review, drawing as broadly as possible on both the standard works on Babylonian mathematics and the most recent articles. The transmission of mathematical achievements was approached by examining well-known central figures of Greek mathematics and astronomy and their connections to Babylonian mathematics. On this basis, a coherent account was formed of the achievements of Babylonian mathematics and the transmission of its knowledge. Babylonian mathematics used an original and advanced sexagesimal system with base 60, which was the first known positional number system. Babylonian mathematicians can justifiably be called the best calculators of antiquity. They knew many famous theorems, such as the Pythagorean theorem and Thales' theorem, could solve quadratic equations, and used various efficient algorithms for computing approximations more than a thousand years before the Greeks. Thales and Pythagoras, whom the Greeks regarded as their first mathematicians, apparently learned their best-known results from the Babylonians, and their significance lies primarily in transmitting knowledge and organizing the various parts of mathematics.
Babylonian astronomy was advanced, and the Greek Hipparchus drew not only on Babylonian observations but also on Babylonian computational methods in his own research. It is on the basis of these solutions that the circle is still divided into 360 degrees, each degree into 60 parts. By the same principle, rooted in Babylonian mathematics, hours and minutes are divided into 60 parts.
-
(2019)Tailoring a hybrid surface or any complex material to have functional properties that meet the needs of an advanced device or drug requires knowledge and control of the atomic-level structure of the material. The atomistic configuration can often be the decisive factor in whether the device works as intended, because the material's macroscopic properties - such as electrical and thermal conductivity - stem from the atomic level. However, such systems are difficult to study experimentally and have so far been infeasible to study computationally due to costly simulations. I describe the theory and practical implementation of a 'building block'-based Bayesian Optimization Structure Search (BOSS) method to efficiently address heterogeneous interface optimization problems. This machine learning method is based on accelerating the identification of a material's energy landscape with respect to the number of quantum mechanical (QM) simulations executed. The acceleration is realized by applying a likelihood-free Bayesian inference scheme to evolve a Gaussian process (GP) surrogate model of the target landscape. During this active learning, various atomic configurations are iteratively sampled by running static QM simulations. Approximating structures with chemical building blocks reduces the search phase space to manageable dimensions, so the most favored structures can be located with as little computation as possible. It is thus feasible to perform structure search with large simulation cells while still maintaining high chemical accuracy. The BOSS method was implemented as a Python code called aalto-boss between 2016 and 2019, with me as the main author in co-operation with Milica Todorović and Patrick Rinke. I conducted a dimensional scaling study using analytic functions, which quantified the scaling of BOSS efficiency for fundamentally different functions as dimension increases.
The results revealed the important role of the target function's derivatives in optimization efficiency. The outcome will help users choose simulation variables that are efficient to optimize, and roughly estimate how many BOSS iterations may be needed until convergence. The predictive efficiency and accuracy of BOSS were showcased in the conformer search of the alanine dipeptide molecule: the two most stable conformers and the characteristic 2D potential energy map were found with greatly reduced effort compared to alternative methods. The value of BOSS in novel materials research was showcased in the surface adsorption study of biphenyldicarboxylic acid on a CoO thin film using DFT simulations, where we found two adsorption configurations with a lower energy than previous calculations, approximately supported by the experimental data on the system. The three applications showed that BOSS can significantly reduce the computational load of atomistic structure search while maintaining predictive accuracy. It allows materials scientists to study novel materials more efficiently, and thus helps tailor materials' properties to better suit the needs of modern devices.
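The active-learning loop behind such a method pairs a Gaussian process surrogate with an acquisition rule that trades off predicted energy against uncertainty. A bare-bones 1D sketch with an RBF kernel and a lower-confidence-bound rule; the landscape, kernel length scale, and acquisition parameter are all invented and bear no relation to the aalto-boss implementation:

```python
import math

def rbf(a, b, ls=1.0):
    # Squared-exponential (RBF) kernel.
    return math.exp(-0.5 * ((a - b) / ls) ** 2)

def solve(m, v):
    # Gaussian elimination with partial pivoting.
    n = len(m)
    aug = [row[:] + [v[i]] for i, row in enumerate(m)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(aug[r][c]))
        aug[c], aug[p] = aug[p], aug[c]
        for r in range(c + 1, n):
            f = aug[r][c] / aug[c][c]
            for k in range(c, n + 1):
                aug[r][k] -= f * aug[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (aug[r][n] - sum(aug[r][k] * x[k] for k in range(r + 1, n))) / aug[r][r]
    return x

def gp_posterior(xs, ys, xq, noise=1e-8):
    # GP posterior mean and variance at query point xq (zero prior mean).
    K = [[rbf(a, b) + (noise if i == j else 0.0) for j, b in enumerate(xs)]
         for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    kq = [rbf(xq, xi) for xi in xs]
    mean = sum(k * a for k, a in zip(kq, alpha))
    w = solve(K, kq)
    var = rbf(xq, xq) - sum(k * wi for k, wi in zip(kq, w))
    return mean, max(var, 0.0)

def next_sample(xs, ys, candidates, kappa=2.0):
    # Lower confidence bound: favour low predicted energy and high uncertainty.
    def lcb(x):
        m, v = gp_posterior(xs, ys, x)
        return m - kappa * math.sqrt(v)
    return min(candidates, key=lcb)

f = lambda x: (x - 0.7) ** 2            # toy energy landscape, minimum at 0.7
xs, ys = [0.0, 1.5], [f(0.0), f(1.5)]   # two initial static "simulations"
grid = [i / 20 for i in range(1, 30)]   # interior candidate points on (0, 1.5)
for _ in range(6):
    x = next_sample(xs, ys, grid)
    grid.remove(x)                      # avoid resampling the same point
    xs.append(x)
    ys.append(f(x))
best_x = xs[min(range(len(xs)), key=lambda i: ys[i])]
```

Each loop iteration stands in for one static QM simulation; the surrogate quickly concentrates its samples near the energy minimum, which is the source of the acceleration.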
-
(2020)Biogenic volatile organic compounds (BVOCs) play a major role in the atmosphere by acting as precursors in the formation of secondary organic aerosols and by affecting the concentration of ozone. The chemical diversity of BVOCs is vast, but global emissions are dominated by isoprene and monoterpenes. The emissions of BVOCs from plants are affected by environmental parameters, with temperature and light having significant impacts. Downy birch and Norway spruce trees contain heavy, low-volatility compounds, but published results have so far been limited to observations of sesquiterpenoid emissions from these two trees. In this study, the Vocus proton-transfer-reaction time-of-flight mass spectrometer is deployed in the field to examine BVOC emissions from Downy birch and Norway spruce trees. With higher mass resolution, faster time response and lower limits of detection than conventional PTR instruments, the Vocus can effectively measure a broader range of VOCs. For the first time, real-time emissions of diterpenes and 12 different oxygenated compounds were observed from birch and spruce trees. The emission spectrum of birch was dominated by C10H17+, while for spruce C5H9+ contributed the most. The summed emissions of oxygenated compounds contributed significantly to the observed total emissions from both trees. The emission rates of all compounds varied dramatically throughout the period due to fluctuations in temperature and light. Due to a lack of data from spruce, conclusive results on the temperature and light response of terpene emissions could not be drawn. For birch, the emission rates were well explained by the temperature and temperature-light algorithms. The terpene emissions modelled using both algorithms correlated similarly with experimental data, making it difficult to decisively conclude whether the emissions originated from synthesis or storage pools.
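The temperature algorithm referred to above is commonly the exponential form of Guenther et al. (1993), E = E_s·exp(β(T − T_s)). A one-function sketch; the standard emission rate E_s and sensitivity β below are generic placeholder values, not parameters fitted in this study:

```python
import math

def emission_temperature(T, E_s=1.0, beta=0.09, T_s=303.15):
    # Exponential temperature-only emission algorithm:
    #   E_s  - emission rate at the standard temperature T_s [K]
    #   beta - empirical temperature sensitivity [1/K]
    return E_s * math.exp(beta * (T - T_s))

print(emission_temperature(303.15))  # at T_s the rate equals E_s, i.e. 1.0
ratio = emission_temperature(313.15) / emission_temperature(303.15)
print(ratio)  # roughly 2.5-fold increase per +10 K with beta = 0.09
```

The temperature-light variant adds a light-dependent activity factor; fitting measured emission rates against these algorithms is how pool versus synthesis origins are usually probed.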
-
(2020)The literature part of this thesis reviews various commercial biopolymers, their synthesis, use, and biodegradation. The main emphasis is on the biodegradation of different materials and on the commercial use of these materials. Standards, research methods, and acceptance criteria for evaluating biodegradation are presented extensively. In the experimental part, a wood composite was prepared from a blend of PLA and PBAT, and the thermomechanical properties of the material were characterized. The goal was to create a biodegradable material whose properties make its commercial use as a replacement for single-use plastics sensible. The mechanical properties of the material were characterized for the durability of the end product, and the melt properties to enable commercial production. Thermomechanical analyses were carried out to characterize the shelf life and thermal properties of the material. The biodegradation of a plain PLA/wood composite in the marine environment was also studied. As a result, a wood composite was created that biodegrades at a sufficient rate, whose mechanical properties are adequate to replace various single-use plastic products, and which can be processed with existing extrusion equipment.
-
(2020)NMR Services Australia (NMRSA) Pty Ltd has developed a Borehole Magnetic Resonance (BMR) tool based on the principles of nuclear magnetic resonance (NMR). Drillhole NMR tools have been used mostly in sedimentary environments for oil and gas exploration, while applications in hard, heterogeneous, crystalline bedrock are still lacking. This study aims to test the BMR method in a hard rock environment and to determine hydrogeological parameters at the spent nuclear fuel disposal site on the island of Olkiluoto. Essentially, the objective is to design an optimal BMR data processing workflow and to calibrate the estimated hydrogeological parameters, currently optimized for data from sedimentary environments, to suit crystalline bedrock. To test the BMR method in hard, crystalline bedrock, Posiva Oy, the Finnish expert organization responsible for spent nuclear fuel disposal, made test measurements in the drillholes of the repository site. The collected data were processed with WellCAD software using an additional NMR module. The BMR tool derives the T2 distribution (representing pore size distribution), total porosity, bound and movable water volumes, and permeability calculated with two different models. Several processing parameters (main/burst sequence, moving averages, temperature gradient, cutoff values) were tested and adjusted to suit crystalline bedrock. Magnetizing material in the surface environment strongly disturbed the uppermost ~20 m of the measurement data. Some noise was also encountered deep in the bedrock and was cut from the signal; a list of criteria was created for recognizing noise. The BMR data were compared with other drillhole data acquired by Posiva, i.e. fracture and lithology logs, seismic velocities and hydrogeological measurements. It was observed that the T2 distribution and total porosity correlate rather well with logged fractures and seismic velocities.
Lithological variations did not correlate with the BMR data consistently, mostly because of the strong dependency on fracturing. Permeabilities were compared to earlier hydrogeological measurements with the intention of calibrating the permeability models. However, this proved challenging due to the significant differences between the BMR method and conventional hydrogeological measurements. Preferably, the permeability models should be calibrated against laboratory measurements on drill core, and possibly a new permeability model suitable for crystalline bedrock should be created.
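In NMR logging, the bound/movable split mentioned above is conventionally made with a T2 cutoff: signal relaxing faster than the cutoff is counted as bound water, the rest as movable water. A minimal sketch; the bin values and the 33 ms cutoff are generic illustrative defaults from sandstone practice, not values calibrated for Olkiluoto:

```python
def partition_porosity(t2_bins_ms, amplitudes, cutoff_ms=33.0):
    # Split a T2 distribution into bound and movable water fractions.
    # Amplitudes are assumed proportional to incremental porosity.
    bound = sum(a for t2, a in zip(t2_bins_ms, amplitudes) if t2 < cutoff_ms)
    movable = sum(a for t2, a in zip(t2_bins_ms, amplitudes) if t2 >= cutoff_ms)
    return bound, movable

# Hypothetical distribution: logarithmically spaced T2 bins [ms] and amplitudes.
t2 = [1, 3, 10, 33, 100, 300, 1000]
amp = [0.004, 0.006, 0.01, 0.008, 0.006, 0.004, 0.002]
bound, movable = partition_porosity(t2, amp)
print(round(bound + movable, 3))  # → 0.04, the total porosity
```

Adjusting this cutoff for crystalline rather than sedimentary rock is exactly the kind of parameter calibration the workflow above targets.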
-
(2020)Urban areas account for 70% of worldwide energy-related CO2 emissions and play a significant role in the global carbon budget. With the enhanced consumption of fossil fuels and the dramatic change in land use related to urbanization, control and mitigation of CO2 emissions in urban areas is becoming a major concern for urban dwellers and city managers. Estimating local CO2 emissions in urban areas is therefore of great importance for assessing the effectiveness of mitigation regulation. The Surface Urban Energy and Water Balance Scheme (SUEWS), incorporating a CO2 exchange module, provides an advanced method to model total urban CO2 flux and to quantify the different local-scale emission sectors: transportation, human metabolism, buildings and vegetation. Using appropriate input data, such as detailed site information and meteorological conditions, it can simulate local or neighbourhood-scale CO2 emissions over a specific period, or even under a future scenario. In this study, the SUEWS model is applied to an urban region, Jätkäsaari, an extension of the Helsinki city centre, to simulate anthropogenic and biogenic CO2 emissions in the past and future. The construction of this district started in 2009 and is planned to be completed in 2030, making the region a good case for investigating the impacts of urban planning on urban CO2 emissions. Based on urban surface information, meteorological data, and abundant emission parameters, a simulation of this 1650 × 1400 m area, with a spatial resolution of 50 × 50 m and a time resolution of one hour, was conducted to obtain the total annual CO2 emissions and the temporal and spatial variability of CO2 fluxes from different sources and sinks in 2008 and 2030. Positive CO2 fluxes indicate CO2 sources, while negative fluxes indicate CO2 sinks.
In both the past and the future case, the spatial variation of net CO2 fluxes in Jätkäsaari is dominated by the distribution of traffic and human activity. From April to September, the vegetation acts as a CO2 sink with negative net ecosystem exchange. In 2008, the modelled cumulative CO2 flux is 3.0 kt CO2 year-1, consisting of 1.9 kt CO2 year-1 from metabolism, 1.9 kt CO2 year-1 from traffic and 0.5 kt CO2 year-1 from soil and vegetation respiration, offset by -1.3 kt CO2 year-1 from photosynthesis. In 2030, the total annual CO2 emissions increase to 11.1 kt CO2 year-1 because of the rising traffic volume and number of inhabitants; road traffic becomes the dominant CO2 source, accounting for 53% of the total emissions. Regarding the diurnal variation, in 2008 the study area remains a CO2 source except during summertime mornings, when the net CO2 flux is negative, whereas in 2030 the net CO2 flux is positive throughout the day.
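The 2008 carbon budget reported above is simple component arithmetic; a minimal sketch using only the values stated in the abstract (not the SUEWS model itself):

```python
# Annual CO2 flux components for Jätkäsaari in 2008 (kt CO2 / year),
# as reported in the abstract; positive = source, negative = sink.
components = {
    "metabolism": 1.9,
    "traffic": 1.9,
    "soil_and_vegetation_respiration": 0.5,
    "photosynthesis": -1.3,
}

# The net cumulative flux is the sum of all sources and sinks.
net_flux = sum(components.values())
print(round(net_flux, 1))  # 3.0, matching the reported cumulative flux
```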
-
(2020)This thesis presents the Cauchy-Euler equation, its solution, and two of its many applications. The Cauchy-Euler equation is a homogeneous linear differential equation with variable coefficients. The first chapter justifies the choice of topic and provides background on linear differential equations and on the history of the Cauchy-Euler equation. The second chapter presents the Cauchy-Euler equation and some of the auxiliary results needed to prove its solution. The third chapter proves the solution of the equation for both the second order and the nth order. Before each proof, the auxiliary results most relevant to it are presented; the most notable of these is the Laplace transform. The second-order solution is proved because it is easier to understand, it is needed in both applications, and it helps in understanding the nth-order solution. The fourth chapter presents two applications of the equation: the solution of Laplace's equation in polar coordinates and the solution of the Black-Scholes equation. Laplace's equation is used in physics to describe time-independent phenomena, for example electromagnetic potentials, steady-state temperatures and hydrodynamics. Its polar-coordinate form is used in situations where the domain is a disk bounded by a circle. The Black-Scholes equation is used in financial mathematics to describe the change in value of stock options. Both equations are therefore widely used, and they are important applications of the Cauchy-Euler equation. The fifth chapter presents the results of the thesis: the nth-order solution of the Cauchy-Euler equation, the solution of Laplace's equation in polar coordinates, and the solution of the Black-Scholes equation.
The solutions of both Laplace's equation in polar coordinates and the Black-Scholes equation are obtained by separation of variables, which yields two equations, one of which is a second-order Cauchy-Euler equation whose solution was proved earlier.
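For reference, the second-order Cauchy-Euler equation mentioned above reduces to an algebraic characteristic equation under the standard trial solution; a brief sketch of this well-known reduction (not the thesis's full Laplace-transform treatment):

```latex
% Second-order Cauchy-Euler equation (constants a, b, c with a \neq 0):
a x^2 \frac{d^2 y}{dx^2} + b x \frac{dy}{dx} + c y = 0, \qquad x > 0.
% Substituting the trial solution y = x^m gives the characteristic equation
a m(m-1) + b m + c = 0,
% whose roots m_1, m_2 determine the general solution; for distinct real roots,
% y = C_1 x^{m_1} + C_2 x^{m_2}.
```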
-
(2020)Computing an edit distance between strings is one of the central problems in both string processing and bioinformatics. Optimal solutions to edit distance are quadratic in the lengths of the input strings. The goal of this thesis is to study a new approach to approximating edit distance. We use the chaining algorithm presented by Mäkinen and Sahlin in "Chaining with overlaps revisited" (CPM 2020), implemented verbatim. Building on the chaining algorithm, our focus is on efficiently finding a good set of anchors for it. We present three approaches to computing the anchors as maximal exact matches: the bidirectional Burrows-Wheeler transform, minimizers, and, lastly, a hybrid implementation of the two. Using the maximal exact matches as anchors, we can efficiently compute an optimal chaining alignment for the strings. The chaining alignment further allows us to determine all intervals where mismatches occur by looking at which sequences are not in the chain. Using these smaller intervals lets us approximate edit distance with a high degree of accuracy and a significant speed improvement. The methods described present a way to approximate edit distance in time complexity bounded by the number of maximal exact matches.
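The quadratic baseline the abstract refers to is the classical dynamic-programming edit distance; a minimal sketch of that baseline (not the thesis's chaining-based approximation):

```python
def edit_distance(a: str, b: str) -> int:
    """Classical O(len(a) * len(b)) dynamic-programming edit distance
    (Levenshtein: unit-cost insertions, deletions and substitutions)."""
    m, n = len(a), len(b)
    # dp[j] holds the distance between the current prefix of a and b[:j];
    # a single rolling row suffices instead of the full (m+1) x (n+1) table.
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev_diag, dp[0] = dp[0], i  # save dp[i-1][0], set dp[i][0] = i
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            prev_diag, dp[j] = dp[j], min(
                dp[j] + 1,         # deletion from a
                dp[j - 1] + 1,     # insertion into a
                prev_diag + cost,  # match or substitution
            )
    return dp[n]

print(edit_distance("kitten", "sitting"))  # 3
```

The chaining approach in the thesis avoids filling this whole quadratic table by running such a computation only inside the mismatch intervals between chained anchors.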
-
(2019)Cities are facing pressure to overcome critical challenges that force us to rethink our unsustainable mobility patterns, and the transportation sector is therefore going through major changes. Mobility as a Service (MaaS), a concept that originates from Finland, is one of the innovations trying to change how we travel. MaaS brings all transport providers and modes into one platform. A distinctive feature of MaaS is the possibility to buy tickets for the entire journey, removing the need to go through multiple websites and ticket schemes. However, MaaS is still an emerging concept and therefore lacks an official definition. Finland has been at the forefront of this transportation reform with new legislation that supports the creation of MaaS. The public sector has traditionally had a central role in the provision of transport services where regulation and subsidies are needed. However, the new legislation strongly advocates market-based services, and thus the public sector needs to reconsider its position. It is therefore important to understand how the Finnish public sector, and the parties actually executing the law, see MaaS, its impacts, and their own role in it. The thesis is qualitative in nature: 20 public sector representatives were interviewed from 17 different organizations, consisting of governmental organizations, interest groups, regional organizations, and cities of varying size. The interview analysis has been guided by the concept of emerging technology, characterized as technology that can change multiple sectors at the same time but has not yet demonstrated its value. The results showed a wide variety in how public sector representatives define MaaS. Additionally, the respondents felt there are many challenges related to MaaS, such as a working business model, lack of services, technical challenges, and the area of demand, among others.
On the positive side, MaaS could make transport more efficient and provide savings for the public sector. From the user's perspective, it was clear that MaaS needs to be effortless in order to compete with private cars. Overall, the respondents saw more opportunities for MaaS than possible negative effects, but the lack of a widespread MaaS scheme makes it hard to evaluate any effects. However, MaaS also raised suspicions among some respondents. As for the legislation, it did not receive any positive feedback outside of government officials; in particular, the openness of the drafting process was criticized. The results also showed contradicting views on roles among the different groups of representatives. In conclusion, attention should be paid to how future policies are formed, as the experienced exclusion from drafting the legislation may have hindered cooperation and created suspicion towards the whole concept. Additionally, there are clearly insecurities within the public sector caused by the uncertainties related to MaaS. Implementation has been slow, since the public sector feels the government has told them to do something they do not have the ability to do. Nevertheless, the public sector generally still welcomes MaaS. Cities in particular hoped that MaaS would enable them to cut their services in low-density areas. However, there is still no will to support MaaS financially; taking the risks is seen as a job for the private sector.
-
(2020)Cadmium telluride (CdTe) has a high quantum efficiency and a bandgap of 1.44 eV, and is consequently used to detect gamma rays efficiently. The aim of this thesis is to explore the properties of a pixelated CdTe detector and the procedures conducted to fine-tune its electronic readout system. A fully functional CdTe detector would be useful in medical imaging techniques such as Boron Neutron Capture Therapy (BNCT), which requires a detector with good energy resolution, good timing resolution and good stopping power. Although the CdTe crystal is a promising material, its growth process is difficult because various types of defects appear inside the crystal, so the quality assurance process has to be thorough for suitable crystals to be found. An aluminum oxide (Al2O3) passivation layer was deposited onto the surface of the crystal. The contacts on both sides were created by sputter deposition of titanium tungsten (TiW) and gold (Au), followed by electroless nickel growth. I tested the pixelated CdTe detector with radioactive sources such as Am-241, Ba-133, Co-57, Cs-137 and an X-ray quality series in order to study the sensitivity of the device and its capacity to detect gamma and X-rays.