
Browsing by discipline "none"


  • Anttila, Jesse (2020)
    Visual simultaneous localization and mapping (visual SLAM) is a method for consistent self-contained localization using visual observations. Visual SLAM can produce very precise pose estimates without any specialized hardware, enabling applications such as AR navigation. The use of visual SLAM in very large areas and over long distances is not presently possible due to a number of significant scalability issues. In this thesis, these issues are discussed and solutions for them explored, culminating in a concept for a real-time city-scale visual SLAM system. A number of avenues for future work towards a practical implementation are also described.
  • Martinmäki, Tatu (2020)
    Molecular imaging is the visualization, characterization and quantification of biological processes at the molecular and cellular levels of living organisms, achieved with molecular imaging probes and techniques such as radiotracer imaging, magnetic resonance imaging and ultrasound imaging. Molecular imaging is an important part of patient care. It allows detection and localization of disease at early stages, and it is also an important tool in drug discovery and development. Positron emission tomography (PET) is a biomedical imaging technique considered one of the most important advances in biomedical sciences. PET is used for a variety of biomedical applications, e.g. imaging of divergent metabolism, oncology and neurology. PET is based on the incorporation of positron-emitting radionuclides into drug molecules. As the prominent radionuclides used in PET have short or ultra-short half-lives, the radionuclide is most often incorporated into the precursor in the last step of the synthesis. This has proven to be a challenge with novel targeted radiotracers, as the demand for high specific activity leads to harsh reaction conditions, often with extreme pH and heat, which could denature the targeting vector. Click chemistry is a synthetic approach based on modular building blocks. The concept was originally developed for the purposes of drug discovery and development. It has been widely utilized in radiopharmaceutical development for conjugating prosthetic groups or functional groups to precursor molecules. Click chemistry reactions are highly selective and, owing to a strong thermodynamic driving force, proceed with fast kinetics under mild reaction conditions, which makes the concept ideal for the development and production of PET radiopharmaceuticals. Isotope exchange (IE) radiosynthesis with trifluoroborate moieties is an alternative labeling strategy for reasonably high-yield 18F labeling of targeted radiopharmaceuticals. As the labeling conditions in IE are milder than in the commonly utilized nucleophilic fluorination, the scope of targeting vectors can be extended to labile biomolecules expressing highly specific binding to drug targets, resulting in higher contrast in PET imaging. A trifluoroborate-functionalized prosthetic group 3 was synthesized utilizing click chemistry reactions, purified with SPE and characterized with HPLC-MS and NMR (1H-, 11B-, 13C- and 19F-NMR). [18F]3 was successfully radiolabeled with an RCY of 20.1 %, an incorporation yield of 22.3 ± 11.4 % and an RCP of >95 %. A TCO-functionalized TOC-peptide precursor 6 was synthesized from a commercial octreotide precursor and a commercially available click chemistry building block via oxime bond formation. 6 was characterized with HPLC-MS and purified with semi-preparative HPLC. The final product [18F]7 was produced in a two-step radiosynthesis via IEDDA conjugation of [18F]3 and 6. [18F]7 was produced with an RCY of 1.0 ± 1.0 %, an RCP of >95 % and an estimated molar activity of 0.7 ± 0.8 GBq/µmol. A cell uptake study was conducted with [18F]7 in the AR42J cell line. Internalization and specific binding to SSTR2 were observed in vitro.
  • Koivisto, Teemu (2021)
    Programming courses often receive large quantities of program code submissions to exercises which, due to their large number, are graded and students provided feedback automatically. Teachers might never review these submissions, therefore losing a valuable source of insight into student programming patterns. This thesis researches how these submissions could be reviewed efficiently using a software system, and a prototype, CodeClusters, was developed as an additional contribution. CodeClusters' design goals are to allow the exploration of the submissions and specifically finding higher-level patterns that could be used to provide feedback to students. Its main features are full-text search and an n-gram similarity detection model that can be used to cluster the submissions. Design science research is applied to evaluate CodeClusters' design and to guide the next iteration of the artifact, and qualitative analysis, namely thematic synthesis, is used to evaluate the problem context as well as the ideas of using software for reviewing and providing clustered feedback. The study method was interviews conducted with teachers who had experience teaching programming courses. Teachers were intrigued by the ability to review submitted student code and to provide more tailored feedback to students. The system, while still a prototype, is considered worthwhile to experiment with on programming courses. A tool for analyzing and exploring submissions seems important to enable teachers to better understand how students have solved the exercises. Providing additional feedback can be beneficial to students, yet the feedback should be valuable and the students incentivized to read it.
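The clustering approach described above can be illustrated with a minimal sketch; the character n-gram features, TF-IDF weighting, and cluster count below are illustrative assumptions, not CodeClusters' actual configuration.

```python
# Minimal sketch: cluster code submissions by character n-gram similarity so a
# reviewer can inspect one representative per group. Settings are illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

submissions = [
    "for i in range(10): print(i)",
    "i = 0\nwhile i < 10:\n    print(i)\n    i += 1",
    "print(sum(range(10)))",
]

# Represent each submission as TF-IDF weighted character n-grams (n = 3..5).
vectorizer = TfidfVectorizer(analyzer="char", ngram_range=(3, 5))
X = vectorizer.fit_transform(submissions)

# Group similar submissions; each cluster can then receive shared feedback.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for label, code in zip(kmeans.labels_, submissions):
    print(label, repr(code[:40]))
```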
  • Martesuo, Kim (2019)
    Creating a user interface (UI) is often a part of software development. In the software industry, designated UI designers work side by side with the developers in agile software development teams. While agile software processes have been researched, there is no general consensus on how UI designers should be integrated with the developing team. The existing research points towards the industry favoring tight collaboration between developers and UI designers by having them work together in the same team. The subject is gathering interest, and different ways of integration are emerging in the industry. In this thesis we researched the collaboration between developers and UI designers in agile software development. The goal was to understand the teamwork between the UI designers and developers working in the same agile software teams. The research was conducted by doing semi-structured theme interviews with UI designers and developers individually. The interviewees were from consulting firms located in the Helsinki metropolitan area in Finland. The subjects reported on a recent project where they worked in an agile software team consisting of UI designers and developers. The data from the interviews was compared to the literature. Results of the interviews were similar to the findings from the literature for the most part. Finding a suitable process for the teamwork, co-location, good social relations and an atmosphere of trust were factors present in the literature and the interviews. The importance of good software tools for communicating designs, and of developers taking part in the UI designing process, stood out from the interviews.
  • Rautio, Siiri (2019)
    Improving the quality of medical computed tomography reconstructions is an important research topic nowadays, when low-dose imaging is pursued to minimize the X-ray radiation inflicted on patients. Using lower radiation doses for imaging leads to noisier reconstructions, which then require postprocessing, such as denoising, in order to make the data up to par for diagnostic purposes. Reconstructing the data using iterative algorithms produces higher quality results, but they are computationally costly and not quite powerful enough to be used as such for medical analysis. Recent advances in deep learning have demonstrated the great potential of using convolutional neural networks in various image processing tasks. Performing image denoising with deep neural networks can produce high-quality and virtually noise-free predictions out of images originally corrupted with noise, in a computationally efficient manner. In this thesis, we survey the topics of computed tomography and deep learning for the purpose of applying a state-of-the-art convolutional neural network for denoising dental cone-beam computed tomography reconstruction images. We investigate how the denoising results of a deep neural network are affected if iteratively reconstructed images are used in training the network, as opposed to using traditionally reconstructed images. The results show that if the training data is reconstructed using iterative methods, it notably improves the denoising results of the network. Also, we believe these results can be further improved and extended beyond the case of cone-beam computed tomography and the field of medical imaging.
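A rough sketch of the training setup described, assuming a generic residual CNN trained with an MSE loss on pairs of noisy inputs and higher-quality (e.g. iteratively reconstructed) targets; the architecture and placeholder data are not the network used in the thesis.

```python
# Minimal sketch of training a denoising CNN: noisy CT slices as input,
# cleaner (iteratively reconstructed) slices as targets. Placeholder data.
import torch
import torch.nn as nn

class DenoisingCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        # Predict the residual noise and subtract it from the input.
        return x - self.net(x)

model = DenoisingCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder batch: 8 noisy slices and their cleaner reconstruction targets.
noisy = torch.randn(8, 1, 64, 64)
target = torch.randn(8, 1, 64, 64)

for step in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(noisy), target)
    loss.backward()
    optimizer.step()
```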
  • Jääskeläinen, Matias (2020)
    This thesis explores descriptors for atmospheric molecular clusters. Descriptors are needed for applying machine learning methods to molecular systems. There is a collection of descriptors readily available in the DScribe library, developed at Aalto University for custom machine learning applications; which descriptors to use is up to the user to decide. This study takes the first steps in integrating machine learning into the existing procedure of configurational sampling that aims to find the optimal structure for any given molecular cluster of interest. The structure selection step forms a bottleneck in the configurational sampling procedure. A new structure selection method presented in this study uses k-means clustering to find structures that are similar to each other. The clustering results can be used to discard redundant structures more effectively than before, which leaves fewer structures to be calculated with more expensive computations. Altogether that speeds up the configurational sampling procedure. To aid the selection of a suitable descriptor for this application, a comparison of four descriptors available in DScribe is made. A procedure for structure selection, in which atmospheric clusters are represented with descriptors and labeled into groups with k-means, was implemented. The performance of the descriptors was compared with a custom score suitable for this application, and it was found that MBTR outperforms the other descriptors. This structure selection method will be utilized in the existing configurational sampling procedure for atmospheric molecular clusters, but it is not restricted to that application.
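A minimal sketch of the structure selection step, assuming descriptor vectors (for example MBTR vectors computed with DScribe) have already been obtained; random placeholder data stands in for the descriptors here.

```python
# Minimal sketch: cluster descriptor vectors with k-means and keep one
# representative structure per cluster, discarding the rest as redundant.
# The descriptor matrix is a random placeholder, not real DScribe output.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import pairwise_distances_argmin_min

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(500, 128))  # 500 candidate structures, 128-dim descriptors

kmeans = KMeans(n_clusters=20, n_init=10, random_state=0).fit(descriptors)

# Keep the structure closest to each cluster centre for further computation.
representatives, _ = pairwise_distances_argmin_min(kmeans.cluster_centers_, descriptors)
print(sorted(representatives))
```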
  • Ottensmann, Linda (2020)
    It is challenging to identify causal genes and pathways explaining the associations with diseases and traits found by genome-wide association studies (GWASs). To address this problem, a variety of methods that prioritize genes based on the variants identified by GWASs have been developed. In this thesis, the methods Data-driven Expression Prioritized Integration for Complex Traits (DEPICT) and Multi-marker Analysis of GenoMic Annotation (MAGMA) are used to prioritize causal genes based on the most recently published publicly available schizophrenia GWAS summary statistics. The two methods are compared using the Benchmarker framework, which allows an unbiased comparison of gene prioritization methods. The study has four aims: firstly, to explain the differences between the gene prioritization methods DEPICT and MAGMA and how the two methods work; secondly, to explain how the Benchmarker framework can be used to compare gene prioritization methods in an unbiased way; thirdly, to compare the performance of DEPICT and MAGMA in prioritizing genes based on the latest schizophrenia summary statistics from 2018 using the Benchmarker framework; and lastly, to compare the performance of DEPICT and MAGMA on a schizophrenia GWAS with a smaller sample size by using Benchmarker. Firstly, the published results of the Benchmarker analyses using the schizophrenia GWAS from 2014 were replicated to make sure that the framework was run correctly. The results were very similar, and both the original and the replicated results show that DEPICT and MAGMA do not perform significantly differently. Furthermore, they show that the intersection of genes prioritized by DEPICT and MAGMA outperforms the outersection, which is defined as the genes prioritized by only one of these methods. Secondly, Benchmarker was used to compare the performance of DEPICT and MAGMA in prioritizing genes using the schizophrenia GWAS from 2018. The results of the Benchmarker analyses suggest that DEPICT and MAGMA perform similarly with the GWAS from 2018 compared to the GWAS from 2014. Furthermore, an earlier schizophrenia GWAS from 2011 was used to check whether the performance of DEPICT and MAGMA differs when a GWAS with lower statistical power is used. The results of the Benchmarker analyses make clear that MAGMA performs better than DEPICT in prioritizing genes using this smaller data set. Furthermore, for the schizophrenia GWAS from 2011 the outersection of genes prioritized by DEPICT and MAGMA outperforms the intersection. To conclude, the Benchmarker framework is a useful tool for comparing gene prioritization methods in an unbiased way. For the most recently published schizophrenia GWAS from 2018 there is no significant difference between the performance of DEPICT and MAGMA in prioritizing genes according to Benchmarker. For the smaller schizophrenia GWAS from 2011, however, MAGMA outperformed DEPICT.
  • Pusfitasari, Eka Dian (2019)
    Urine can be used to determine human exposure to nerve agents through the analysis of specific biomarkers. Isopropyl methylphosphonic acid (IMPA) is an important marker of the sarin nerve agent, a highly toxic chemical regulated under the Chemical Weapons Convention (CWC). A methodology for sensitive, reliable, and selective determination of IMPA in a urine matrix was developed and validated, using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The sample preparation method employs normal-phase solid-phase extraction (NP-SPE) using a silica-based cartridge. Before conducting IMPA analysis, the instrument performance was controlled using a quality control sample. Three different ion sources, namely electrospray ionization (ESI), Unispray, and atmospheric pressure chemical ionization (APCI), were compared in order to define the best method for trace analysis of the targeted IMPA. Parameters affecting the ionization process such as cone voltage, capillary voltage, impactor pin voltage, corona voltage, and mobile phase flow rate were optimized. Negative ion mode was selected as the best method for IMPA identification in all three ion sources, and multiple reaction monitoring (MRM) was employed to improve sensitivity and selectivity. The APCI source was shown to be the least sensitive and least efficient ionization technique for IMPA identification. In contrast, using ESI and Unispray resulted in satisfactory data with excellent limit of detection (LOD), limit of quantification (LOQ), precision, and accuracy. The two latter ion sources gave closely comparable values for these parameters: 0.44 ng/mL, 1.46 ng/mL, < 4% precision bias and < 5% accuracy bias for ESI; and 0.42 ng/mL, 1.38 ng/mL, < 4% precision bias and < 4% accuracy bias for Unispray. Nonetheless, Unispray showed better performance than ESI, producing a higher signal intensity/peak area and a lower matrix effect.
  • Länsman, Olá-Mihkku (2020)
    Demand forecasts are required for optimizing multiple challenges in the retail industry, and they can be used to reduce spoilage and excess inventory sizes. Classical forecasting methods provide point forecasts and do not quantify the uncertainty of the process. We evaluate multiple predictive posterior approximation methods with a Bayesian generalized linear model that captures weekly and yearly seasonality, changing trends and promotional effects. The model uses the negative binomial as the sampling distribution because of its ability to scale the variance as a quadratic function of the mean. The forecasting methods provide highest posterior density intervals at different credible levels ranging from 50% to 95%. They are evaluated with a proper scoring function and by calculating hit rates. We also measure the duration of the calculations as an important result due to the scalability requirements of the retail industry. The approximation methods are the Laplace approximation, the Markov chain Monte Carlo (MCMC) method, Automatic Differentiation Variational Inference, and maximum a posteriori inference. Our results show that the MCMC method is too slow for practical use, while the rest of the approximation methods can be considered for practical use. We found that the Laplace approximation and Automatic Differentiation Variational Inference have results closest to the method with the best analytical guarantees, the MCMC method, suggesting that they were better approximations of the model. The model faced difficulties with highly promotional, slow-selling, and intermittent data. The best fit was obtained for high-selling SKUs, for which the model provided intervals with hit rates that matched the levels of the credible intervals.
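A minimal sketch of one building block of such a model, maximum a posteriori estimation for a negative binomial GLM with a log link; the Fourier seasonality features, fixed dispersion, and weak prior below are simplifying assumptions, not the thesis model itself.

```python
# Minimal sketch of MAP estimation for a negative binomial GLM with a log link.
# Weekly seasonality via Fourier terms; dispersion fixed for simplicity.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(1)
t = np.arange(200)
X = np.column_stack([
    np.ones_like(t, dtype=float),
    np.sin(2 * np.pi * t / 7), np.cos(2 * np.pi * t / 7),  # weekly seasonality
])
true_mu = np.exp(X @ np.array([2.0, 0.5, -0.3]))
r = 5.0  # dispersion parameter, fixed here for simplicity
y = rng.negative_binomial(r, r / (r + true_mu))  # simulated daily sales counts

def neg_log_posterior(beta):
    mu = np.exp(X @ beta)
    # Negative binomial log-likelihood in the mean/dispersion parameterisation.
    ll = (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
          + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu)))
    log_prior = -0.5 * np.sum(beta ** 2)  # weak normal(0, 1) prior on coefficients
    return -(ll.sum() + log_prior)

map_fit = minimize(neg_log_posterior, x0=np.zeros(X.shape[1]), method="BFGS")
print(map_fit.x)  # MAP estimate; a Laplace approximation would use the Hessian here
```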
  • Nissilä, Viivi (2020)
    Origin-Destination (OD) data is a crucial part of price estimation in the aviation industry; an OD flight is any number of flights a passenger takes in a single journey. OD data is a complex set of data that is both a flow and a multidimensional type of data. In this work, the focus is to design interactive visualization techniques to support user exploration of OD data. The thesis work aims to find which of two menu designs suits OD data visualization better: a breadth-first or a depth-first menu design. The two menus follow Shneiderman's Task by Data Type Taxonomy, a broader version of the Information Seeking Mantra. The first menu design is a parallel, breadth-first menu layout. The layout shows the variables in an open layout and is closer to the original data matrix. The second menu design is a hierarchical, depth-first layout. This layout is derived from the semantics of the data and is more compact in terms of screen space. The two menu designs are compared in an online survey study conducted with potential end users. The results of the online survey study are inconclusive, and they are therefore complemented with an expert review. Both the survey study and the expert review show that the Sankey graph is a good visualization type for this work, but the interaction of the two menu designs requires further improvements. Both of the menu designs received positive and negative feedback in the expert review. For future work, a solution that combines the positives of the two designs could be considered. ACM Computing Classification System (CCS): Human-centered computing → Visualization → Empirical studies in visualization; Human-centered computing → Interaction design → Interaction design process and methods → Interface design prototyping
  • Sassi, Sebastian (2019)
    When the standard model gauge group SU(3) × SU(2) × U(1) is extended with an extra U(1) symmetry, the resulting Abelian U(1) × U(1) symmetry introduces a new kinetic mixing term into the Lagrangian. Such double U(1) symmetries appear in various extensions of the standard model and have therefore long been of interest in theoretical physics. Recently this kinetic mixing has received attention as a model for dark matter. In this thesis, a systematic review of kinetic mixing and its physical implications is given, some of the dark matter candidates relying on kinetic mixing are considered, and experimental bounds for kinetic mixing dark matter are discussed. In particular, the process of diagonalizing the kinetic and mass terms of the Lagrangian with a suitable basis choice is discussed. A rotational ambiguity arises in the basis choice when both U(1) fields are massless, and it is shown how this can be addressed. BBN bounds for a model with a fermion in the dark sector are also given based on the most recent value of the effective number of neutrino species, and it is found that a significant portion of the FIMP regime is excluded by this constraint.
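The kinetic mixing term and one standard field redefinition that removes it can be written schematically as follows; the notation is illustrative rather than the thesis's own conventions.

```latex
% Abelian kinetic mixing between the hypercharge field B and a new U(1) field X,
% and one common field redefinition that canonically normalises the kinetic terms.
\begin{align}
\mathcal{L}_{\text{gauge}} &= -\tfrac{1}{4} B_{\mu\nu} B^{\mu\nu}
                              -\tfrac{1}{4} X_{\mu\nu} X^{\mu\nu}
                              -\tfrac{\epsilon}{2}\, B_{\mu\nu} X^{\mu\nu}, \\
B_\mu &\to B_\mu - \frac{\epsilon}{\sqrt{1-\epsilon^2}}\, X_\mu, \qquad
X_\mu \to \frac{1}{\sqrt{1-\epsilon^2}}\, X_\mu ,
\end{align}
% After the redefinition both kinetic terms are canonical and the mixing
% reappears as an epsilon-suppressed coupling of X to the visible-sector current.
```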
  • Ahonen, Heikki (2020)
    The research group dLearn.Helsinki has created a software tool for defining the work life competence skills of a person working as part of a group. The software is a research tool for developing the mentioned skills of users, and the users can be of any age, from school children to employees in a company. As the users can be of different age groups, the data privacy of the different groups has to be considered from different aspects. Children are more vulnerable than adults and may not understand all the risks imposed towards them. Thus, in the European Union the General Data Protection Regulation (GDPR) determines that the privacy and data of children are more protected, and this has to be taken into account when designing software which uses said data. For dLearn.Helsinki this caused changes not only in the data handling of children, but also of other users. To tackle this problem, existing and future use cases needed to be planned and possibly implemented. Another solution was to implement different versions of the software, where the organizations would be separate. One option would be determining organizational differences in the existing SaaS solution. The other option would be creating on-premise versions, where organizations would be locked in accordance with the customer type. This thesis introduces said use cases, as well as installation options for both SaaS and on-premise versions. With these, broader views of data privacy and the different approaches are investigated, and it can be concluded that no matter the approach, the data privacy of children will always prove a challenge.
  • Koskimaa, Kuutti (2020)
    AA Sakatti Mining Oy is researching the possibility of conducting mining operations in the Sakatti ore deposit, located partially under the protected Viiankiaapa mire. In order to understand the waters in the mining development site, the interactions of surface waters, shallow aquifers, and deep bedrock groundwaters must be understood. To estimate these interactions, hydrogeochemical characterization was used together with four tracer methods: tritium/helium, dichlorodifluoromethane and sulfur hexafluoride, stable isotopes of hydrogen and oxygen, and carbon-14. Most of the shallow groundwater samples are similar to natural precipitation and groundwater in their chemical composition, being of the calcium bicarbonate type. B-11-17HYD013 was an exception, containing much more Cl and SO4. The samples from the deep 17MOS8193 all show a very typical composition for this type of borehole, on the line between the saline sodium sulphate and sodium chloride water types. The samples from 12MOS8102, as well as the river water samples and the Rytikuru spring sample, are located between these two end members. The hydrogen and oxygen isotope values divided the samples into two distinct groups: those that show an evaporation signal in the source water, and those that do not. The most likely source for the evaporated signal in the groundwaters is the surface water pools in the Viiankiaapa mire, which have infiltrated into the groundwater and followed the known groundwater flow gradient into the observation wells near the River Kitinen. Tritium showed no inclusion of recently recharged water in the deep 17MOS8193, and dated most of the shallow wells with a screen below the bedrock surface to recharge in the 1970s and 1980s. B-10-17HYD017 had an older apparent age from 1955, and B-14-17HYD006 was curiously dated to recharge in 2018. 14C gave an apparent age of over 30 000 a for the deep 17MOS8193. The slight 14C content could be caused by slight contamination during sampling, meaning the age is a minimum. The sample M-4-12MOS8102 gave an apparent age of ~3 500 a, which could in turn be an overestimate due to ancient carbon being dissolved from the local bedrock fractures. CFC-12 showed apparent recharge dates from 1963 to 1975 in the shallow wells, and no recently recharged water in the deep 17MOS8193, and was thus generally in line with the 14C and tritium results, although some contamination had happened. SF6 concentrations exceeded possible concentrations considering the other results, most likely due to underground generation, and the method was dismissed. By trace element composition, all samples from the deep 17MOS8193 are distinct from the other samples and saw slight dilution in the concentrations of most elements over the span of the test pumping. The other samples are more mixed and difficult to interpret, but some trends and connections are visible, such as the higher contents in wells with screens below the bedrock surface than in those with screens above the bedrock surface, and the exceptionally high contents of many elements in B-13-17HYD004. Overall, the study benefited from the large array of methods, showing no interaction between the deep bedrock groundwaters and the shallow groundwaters or surface waters. The evaporated signal from the Viiankiaapa was clearly visible in the samples close to the River Kitinen.
  • Hertweck, Corinna (2020)
    In this work, we seek robust methods for designing affirmative action policies for university admissions. Specifically, we study university admissions under a real centralized system that uses grades and standardized test scores to match applicants to university programs. For the purposes of affirmative action, we consider policies that assign bonus points to applicants from underrepresented groups, with the goal of preventing large gaps in admission rates across groups while ensuring that the admitted students are for the most part those with the highest scores. Since such policies have to be announced before the start of the application period, there is uncertainty about which students will apply to which programs. This poses a difficult challenge for policy-makers. Hence, we introduce a strategy to design policies for the upcoming round of applications that can address either a single demographic group or multiple groups. Our strategy is based on application data from previous years and a predictive model trained on this data. By comparing this predictive strategy to simpler strategies based only on application data from, e.g., the previous year, we show that the predictive strategy is generally more conservative in its policy suggestions. As a result, policies suggested by the predictive strategy lead to more robust effects and fewer cases where the gap in admission rates is inadvertently increased through the suggested policy intervention. Our findings imply that universities can employ predictive methods to increase the reliability of the effects expected from the implementation of an affirmative action policy.
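The basic bonus-point mechanism can be illustrated with a small sketch; the score distributions, bonus values, and capacity below are made-up placeholders, and the actual policy-design strategy in the thesis relies on a predictive model rather than this direct simulation.

```python
# Minimal sketch of a bonus-point admission policy: add a fixed bonus to the
# scores of an underrepresented group, admit the top scorers up to capacity,
# and compare admission rates across groups. All numbers are placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
group = rng.integers(0, 2, size=n)                     # 1 = underrepresented group
score = rng.normal(loc=np.where(group == 1, 48, 52), scale=10)

def admission_rates(score, group, bonus, capacity):
    adjusted = score + bonus * (group == 1)
    admitted = np.zeros(len(score), dtype=bool)
    admitted[np.argsort(-adjusted)[:capacity]] = True  # admit the top-scoring applicants
    return admitted[group == 0].mean(), admitted[group == 1].mean()

for bonus in (0.0, 2.0, 4.0):
    r0, r1 = admission_rates(score, group, bonus, capacity=200)
    print(f"bonus={bonus}: rate group 0 = {r0:.2f}, rate group 1 = {r1:.2f}, gap = {r0 - r1:.2f}")
```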
  • Rannisto, Meeri (2020)
    Bat monitoring is commonly based on audio analysis. By collecting audio recordings from large areas and analysing their content, it is possible to estimate the distributions of bat species and changes in them. It is easy to collect a large amount of audio recordings by leaving automatic recording units in nature and collecting them later. However, it takes a lot of time and effort to analyse these recordings. Because of that, there is a great need for automatic tools. We developed a program for detecting bat calls automatically from audio recordings. The program is designed for recordings that are collected in Finland with the AudioMoth recording device. Our method is based on a median clipping method that has previously shown promising results in the field of bird song detection. We add several modifications to the basic method in order to make it work well for our purpose. We use real-world field recordings that we have annotated to evaluate the performance of the detector and compare it to two other freely available programs (Kaleidoscope and Bat Detective). Our method showed good results and got the best F2-score in the comparison.
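The median clipping idea can be sketched as follows; the spectrogram parameters, the threshold factor of 3, and the frame-level rule are illustrative defaults from the bird-song-detection literature, not the detector's actual settings or modifications.

```python
# Minimal sketch of median clipping on a spectrogram: keep time-frequency cells
# whose energy exceeds a multiple of both their row (frequency) and column (time)
# medians, then flag frames with enough surviving cells as candidate calls.
import numpy as np
from scipy.signal import spectrogram

fs = 250_000                      # placeholder sampling rate for ultrasonic recordings
audio = np.random.randn(fs)       # placeholder one-second recording

f, t, S = spectrogram(audio, fs=fs, nperseg=512, noverlap=256)

row_med = np.median(S, axis=1, keepdims=True)   # per-frequency-band median
col_med = np.median(S, axis=0, keepdims=True)   # per-time-frame median
mask = (S > 3 * row_med) & (S > 3 * col_med)    # median clipping

# A frame is a candidate detection if enough cells survive the clipping.
candidate_frames = t[mask.sum(axis=0) > 5]
print(len(candidate_frames), "candidate frames")
```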
  • Ikkala, Tapio (2020)
    This thesis presents a scalable method for identifying anomalous periods of non-activity in short periodic event sequences. The method is tested with real-world point-of-sale (POS) data from a grocery retail setting. However, the method can also be applied to other problem domains which produce similar sequential data. The proposed method models the underlying event sequence as a non-homogeneous Poisson process with a piecewise constant rate function. The rate function for the piecewise homogeneous Poisson process can be estimated with a change point detection algorithm that minimises a cost function consisting of the negative Poisson log-likelihood and a penalty term that is linear in the number of change points. The resulting model can be queried for anomalously long periods of time with no events, i.e., waiting times, by defining a threshold below which the waiting time observations are deemed anomalies. The first experimental part of the thesis focuses on model selection, i.e., on finding a penalty value that results in the change point detection algorithm detecting the true changes in the intensity of the arrivals of the events while not reacting to random fluctuations in the data. In the second experimental part the performance of the anomaly detection methodology is measured against stock-out data, which gives an approximate ground truth for the termination of a POS event sequence. The performance of the anomaly detector is found to be subpar in terms of precision and recall, i.e., the true positive rate and the positive predictive value. The number of false positives remains high even with small threshold values. This needs to be taken into account when considering applying the anomaly detection procedure in practice. Nevertheless, the methodology may have practical value in the retail setting, e.g., in guiding the store personnel where to focus their resources in ensuring the availability of the products.
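A minimal sketch of the change point step described, assuming the event sequence has been binned into counts: optimal partitioning with a negative Poisson log-likelihood segment cost and a penalty linear in the number of change points; the data and penalty value are placeholders.

```python
# Minimal sketch of change point detection for a piecewise constant Poisson rate:
# optimal partitioning over binned event counts with a per-change-point penalty.
import numpy as np

rng = np.random.default_rng(0)
counts = np.concatenate([rng.poisson(5, 100), rng.poisson(1, 100)])  # rate drop at t=100
penalty = 10.0

cumsum = np.concatenate([[0], np.cumsum(counts)])

def segment_cost(s, t):
    """Negative Poisson log-likelihood of counts[s:t] at the segment's MLE rate
    (constant log-factorial terms dropped)."""
    n = cumsum[t] - cumsum[s]
    length = t - s
    lam = max(n / length, 1e-12)
    return lam * length - n * np.log(lam)

T = len(counts)
best = np.full(T + 1, np.inf)
best[0] = -penalty                 # so the first segment itself is not penalised
last_change = np.zeros(T + 1, dtype=int)
for t in range(1, T + 1):
    for s in range(t):
        cost = best[s] + segment_cost(s, t) + penalty
        if cost < best[t]:
            best[t], last_change[t] = cost, s

# Backtrack the estimated change points.
change_points, t = [], T
while t > 0:
    t = last_change[t]
    if t > 0:
        change_points.append(t)
print(sorted(change_points))
```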
  • Tanskanen, Ville (2020)
    Microbial volatile organic compounds are emitted by a diverse set of microbial organisms and are known to cause health hazards when present in indoor air. Early detection of fungus-contaminated buildings and of the species present is crucial to prevent health problems caused by fungal secondary metabolites. This thesis focuses on analysing the emission profiles of different insulation materials and fungal cultures, which allows efficient new ways to detect fungi from contaminated buildings to be developed in further studies. The studied insulation materials consisted of cellulose and glass wool, which were analysed under multiple different conditions. The humidity of the atmosphere was varied between 0 and 10 microliters, and the temperature was varied between 30°C and 40°C. In the fungal emission profile study, 24 different cultures were analysed in two different atmospheres, ambient and microaerophilic, and with multiple different inoculums. The analysis of both the insulation materials and the fungal cultures was done using a headspace solid phase microextraction Arrow tool and a headspace in-tube extraction tool together with gas chromatography–mass spectrometry. One goal of this thesis was also to test the suitability of these methods for the detection of fungal secondary metabolites. Comprehensive fungal emission profiles were successfully formed and new information on the behaviour of the insulation materials in different settings was found. In addition, new information about the analysis methods and fungal behaviour in different atmospheres was found. Headspace solid phase microextraction Arrow with gas chromatography–mass spectrometry was found to be an efficient, sensitive and time-saving method for indoor air study purposes. Many potential fungal-culture-specific biomarker compounds were also found for further study purposes.
  • Redmond Roche, Benjamin Heikki (2019)
    Significant changes in sea-ice variability have occurred in the northern North Atlantic since the last deglaciation, resulting in global-scale shifts in climate. By relating the dynamic changes of palaeo sea ice to past changes in climate, it is possible to predict future changes in response to anthropogenic climate change. Diatoms allow for detailed reconstructions of palaeoceanographic and sea-ice conditions, both qualitatively, using information on species ecologies, and quantitatively, via a transfer function based upon diatom species optima and tolerances of the variable to be reconstructed. Three diatom species that comprise a large portion of the training set and are proxies for the presence of sea ice, Fragilariopsis oceanica, Fragilariopsis reginae-jahniae and Fossula arctica, have currently been grouped into one species, F. oceanica, in the large diatom training set of the northern North Atlantic region. The clustering of the species may result in an imprecise reconstruction of sea ice that does not take into account all the available ecological information. The proportions of the three species were recounted from the original surface sediment slides, alongside the additional chrysophyte cyst Archaeomonas sp., and statistically analysed using Canoco and the R software package eHOF. A core from Kangerlussuaq Trough comprising the Late Holocene (~690–1498 Common Era) was also recounted and analysed using C2. The separated diatom species and the chrysophyte cyst Archaeomonas sp. exhibited different relationships to both sea-ice concentration (aSIC) and sea surface temperature (aSST). The separated F. oceanica is a 'cold-mixed' water species occurring at cold aSST and both low and high aSIC. High abundances occur in the marginal ice zone (MIZ) where surficial meltwater is high during the spring bloom, with additional inputs from glacial meltwaters nearshore. F. reginae-jahniae is a sea-ice associated species related to cold aSST and high aSIC. High abundances occur in the low-salinity Arctic Water dominated MIZ which experiences significant aSIC. F. arctica is a sea-ice associated species related to cold aSST and high aSIC. High abundances occur in the low-salinity Arctic Water dominated MIZ which experiences high aSIC, particularly in polynya conditions. F. arctica can be considered a characteristic polynya species at high abundances. Archaeomonas sp. is a 'cold-mixed' water species related to both cold and relatively warm aSST and to low and high aSIC. High abundances occur both in relatively warm ice-free Atlantic Water and in cold, high-aSIC Arctic Water conditions, rendering it a more complex proxy for aSST or aSIC. However, the aversion to MIZ conditions indicates that Archaeomonas sp. is associated with a relatively saline, unstratified water column. This is the first time that the distribution and ecology of Archaeomonas sp. have been presented. As such, the ecology described here can be used in future studies. The separation of the three diatom species is crucial for the ecological interpretation of downcore assemblage changes. It is also crucial for the application of transfer functions in order to achieve greater precision in reconstructing aSIC and in assessing the influence of Arctic Water or Atlantic Water, even at low abundances.
  • Heikkinen, Janne (2020)
    Subarctic ponds are important habitats for many freshwater species. The recent increase in global temperatures has stressed the need to study these habitats, as rising water temperatures may have severe consequences for these cold and harsh ecosystems. Despite its importance, this topic has been largely overlooked in scientific research. Diatoms are microscopic, single-celled benthic algae, which are important indicators of environmental quality. Elevation is one of the main environmental variables controlling the composition and richness of diatom species, as it shapes communities through several environmental variables such as temperature and water chemistry. The aim of this thesis was to illustrate the variability in diatom species richness and community composition along an elevational gradient in Kilpisjärvi and to reveal the most important environmental drivers. As an additional focus, the applicability of the BenthoTorch sampling device was tested in measuring benthic algae biomass. Field and laboratory measurements were done using universal standards. Statistical analyses included multiple univariate and multivariate data analysis techniques. It was found that water pH, aluminium concentration and air temperature explained the variation in species richness and community composition the most. Elevation had only a secondary, non-significant role in shaping the diatom communities in subarctic ponds. Nearby sites showed similar compositions in terms of water chemistry and diatom communities. Biotope characterisation did not provide any further insight into the differences or similarities of diatom community composition or species richness. There were some differences in how genera responded to environmental variables. The centre of the distributional range of many taxa was below the mid-point of the elevational gradient, but species often occupied the whole elevational gradient. Rare taxa appeared at the ends of the elevational spectrum. The proportion of singleton taxa was high (25.8%) and can be expected to increase with climate change. The BenthoTorch did provide reasonable results for benthic algae in the subarctic when compared to previous literature, but further research is required to grasp its full potential. Closer examination of the relationships between explanatory variables (e.g. total phosphorus and ion balance) is suggested to gain a better understanding of the changes in diatom species richness and community composition along elevational gradients.
  • Edvinsson, Pontus (2020)
    Socio-economic segregation has been increasing in Helsinki for decades, and the relation between socio-economic factors and educational outcomes has been discussed frequently in recent years, being an important topic for politicians and researchers. Increasing segregation and dwindling school results in the more disadvantaged areas of Finland have been connected in various reports. The main objective of this master's thesis is firstly to investigate the spatial socio-economic differences between the school catchment areas of the 26 municipalities in the Uusimaa region, and secondly the relationship between educational outcomes and socio-spatial segregation in Uusimaa, as the former research evidence has only documented the socio-spatial differentiation within the metropolitan core of the region. The aim is to analyse the relationship of four socio-economic variables, basic level education, higher education, unemployment and low-income households, in each school catchment area and present them with the help of four different maps created in GIS. Lastly, data consisting of educational outcomes from first-year pupils (N=1 920) from 41 different schools in the Uusimaa region, provided by Kansallinen koulutuksen arviointikeskus, were analysed. The data consisted of two standardized tests, one regarding mathematics and one regarding the Finnish language. These two tests were part of a longitudinal evaluation which started in the fall of 2018. The core finding of this study is that Helsinki is by far the area with the largest socio-economic differences between school catchment areas in the Uusimaa region, where eastern Helsinki often displayed low socio-economic levels and western Helsinki and southern Espoo often presented a high socio-economic level compared to the rest of the Uusimaa region. The educational results regarding the Finnish language also had a stronger correlation with the socio-economic data than the mathematical educational outcomes did. These findings offer new insights for Finnish educational policies and demonstrate the need for supporting schools in disadvantaged neighbourhoods in different types of urban and rural areas.