
Browsing by department "none"


  • Rönnberg, Oskar (2020)
    Segregation is usually treated as a place-based phenomenon rooted in residential locations, but during the last ten years more emphasis has been put on understanding segregation as a multi-contextual phenomenon, where mobility in urban space affects the individual’s exposure to segregation. Such research has not yet been done in Helsinki, where socio-economic and ethnic segregation has been on the rise since the 1990s, but there is anecdotal evidence that, for example, young people from disadvantaged neighbourhoods are not as mobile in urban space as others. The aim of this study is to find out how socioeconomic differences and past experiences are linked to how people move around and use urban space in Helsinki. A survey study (N=1 266) was carried out in spring 2020 for the purposes of this research. The study is based on a self-selected sample, so the results cannot be generalized to the whole population. Spatial mobility is analyzed with four measures: which parts of the city the respondent usually moves around in, how often they visit the city center, how many of their everyday activities are located near their home, in the city center and in other neighbourhoods and municipalities, and how many of the places listed in the survey they had visited during the last year. The main research methods are linear regression, correlation analyses and statistical tests. Spatial mobility varies with education, age, family background and mobility practices in youth. These factors explain at most a quarter of the variance in mobility. Cultural and economic capital also correlate with mobility, but their explanatory power diminishes when education and age are controlled for. Spatial mobility is low for those who had small activity spaces in their youth, and especially for those who still live in the same neighbourhood. 
Those who live in the outer suburbs are among the least mobile, and many of the respondents in Northeastern and Eastern Helsinki do not regularly visit Southern Helsinki. Even though many factors that influence the level of mobility are not addressed in this study, the results confirm that family background and past experiences affect individuals’ mobility practices. The results indicate that people who live in disadvantaged neighbourhoods risk exposure to segregation in different contexts of everyday life as a result of low mobility. As people with low education are underrepresented in the study, it is possible that there are forms of immobility in the city that this study has not captured. The results underline the need for more research on multi-contextual segregation and on experiences and conceptions of the city, especially regarding children and young people.
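The "share of variance explained" reported above is the R² of a regression. A minimal ordinary-least-squares sketch (illustrative only; the function name and data are not from the study):

```python
def simple_ols_r2(x, y):
    """Fit y = a + b*x by ordinary least squares and return R^2,
    the share of variance in y explained by x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot
```

With several predictors (education, age, family background), the same idea generalizes to multiple regression; a value of 0.25 would correspond to the "at most a quarter" finding.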
  • Sarapalo, Joonas (2020)
    The page hit counter system processes, counts and stores page hit counts gathered from page hit events on a news media company’s websites and mobile applications. The system serves a public application programming interface which can be queried over the internet for page hit count information. In this thesis I describe the process of replacing a legacy page hit counter system with a modern implementation in the Amazon Web Services ecosystem utilizing serverless technologies. The process includes the background information, the project requirements, the design and comparison of different options, the implementation details and the results. Finally, I show that the new system, implemented with Amazon Kinesis, AWS Lambda and Amazon DynamoDB, has running costs that are less than half those of the old one.
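The core of such a pipeline is a counting step that aggregates a batch of events and applies one atomic increment per page. A sketch under stated assumptions: the event shape and the `table` object are hypothetical stand-ins (with boto3 the increment would be a `table.update_item` call using an `ADD` update expression):

```python
def count_hits(events, table):
    """Aggregate page-hit events per page, then apply one atomic
    counter increment per page, as a Lambda handler consuming a
    Kinesis batch might do."""
    counts = {}
    for event in events:
        page = event["page"]
        counts[page] = counts.get(page, 0) + 1
    for page, n in counts.items():
        # Stand-in for a DynamoDB atomic update ("ADD hits :n");
        # `table` is any object exposing an equivalent add() method.
        table.add(page, n)
    return counts
```

Batching the increments this way keeps the number of writes proportional to distinct pages per batch rather than to raw events, which is one way serverless running costs stay low.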
  • Joosten, Rick (2020)
    In the past two decades, an increasing number of discussions have been held via online platforms such as Facebook or Reddit. The most common source of disruption in these discussions is trolls. Traditional trolls try to derail the discussion into a nonconstructive argument. One strategy to achieve this is to give asymmetric responses, responses that do not follow the conventional patterns. In this thesis we apply a modern machine learning NLP method called ULMFiT to automatically detect the discourse acts of online forum posts in order to detect these conversational patterns. ULMFiT fine-tunes the language model before training its classifier in order to create a more accurate representation of the domain language. This task of discourse act recognition is unique in that it attempts to classify the pragmatic role of each post within a conversation, as opposed to the functional role targeted by tasks such as question-answer retrieval, sentiment analysis, or sarcasm detection. Furthermore, most discourse act recognition research has focused on synchronous conversations where all parties can directly interact with each other, while this thesis looks at asynchronous online conversations. Trained on a dataset of Reddit discussions, the proposed model achieves a Matthews correlation coefficient of 0.605 and an F1-score of 0.69 for predicting discourse acts. Other experiments show that the model is also effective at question-answer classification, and that language model fine-tuning has a positive effect both on classification performance and on the amount of training data required. These results could be beneficial for current trolling detection systems.
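Both reported metrics follow directly from a confusion matrix. A small sketch for the binary case (the cell counts in the test are illustrative, not the thesis’s results):

```python
import math

def mcc_and_f1(tp, fp, fn, tn):
    """Matthews correlation coefficient and F1 score computed from
    the four cells of a binary confusion matrix."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return mcc, f1
```

Unlike F1, MCC uses all four cells (including true negatives), which is why it is often preferred on imbalanced discourse-act classes.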
  • Lange, Moritz Johannes (2020)
    In the context of data science and machine learning, feature selection is a widely used technique that focuses on reducing the dimensionality of a dataset. It is commonly used to improve model accuracy by preventing data redundancy and over-fitting, but can also be beneficial in applications such as data compression. The majority of feature selection techniques rely on labelled data. In many real-world scenarios, however, data is only partially labelled and thus requires so-called semi-supervised techniques, which can utilise both labelled and unlabelled data. While unlabelled data is often obtainable in abundance, labelled datasets are smaller and potentially biased. This thesis presents a method called distribution matching, which offers a way to do feature selection in a semi-supervised setup. Distribution matching is a wrapper method, which trains models to select the features that most improve model accuracy. It addresses the problem of biased labelled data directly by incorporating unlabelled data into a cost function which approximates the expected loss on unseen data. In experiments, the method is shown to successfully minimise the expected loss transparently on a synthetic dataset. Additionally, a comparison with related methods is performed on the more complex EMNIST dataset.
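A wrapper method can be sketched as greedy forward selection driven by a pluggable cost function; here `cost` is a hypothetical stand-in for the distribution-matching objective (which in the thesis mixes labelled loss with a term computed on unlabelled data), not the thesis’s actual implementation:

```python
def greedy_select(features, cost, k):
    """Wrapper-style forward selection: repeatedly add the feature
    whose inclusion gives the lowest value of cost(subset)."""
    selected = []
    remaining = list(features)
    for _ in range(k):
        # Evaluate the cost of adding each remaining feature and
        # commit to the cheapest one.
        best = min(remaining, key=lambda f: cost(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

Because the cost function is the only labelled-data-dependent part, swapping in a semi-supervised cost turns the same wrapper loop into a semi-supervised selector.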
  • Rautsola, Iiro (2019)
    Multimodality imaging is an efficient, non-invasive method for investigating molecular and cellular processes in vivo. However, the potential of multimodality imaging in plant studies is yet to be fully realized, largely due to the lack of research into suitable molecular tracers and instrumentation. Iodine has PET- and SPECT-compatible radioisotopes that have significant advantages over other radioisotopes applied in plant radioisotope imaging, and it can be incorporated into small molecules via a variety of reactions. In this master’s thesis, a radioiodination method exploiting a novel Dowex® H+-mediated addition of iodine to terminal alkynes was optimized and tested on two D-glucose analogues. The goal of the sugar analogue radioiodination was to develop a radioiodinated molecular tracer for plant carbohydrate metabolism studies. The parameters under optimization were activation of the Dowex® resin with HCl, reaction temperature, carrier amount, solvent, and evaporation of excess water. The best results were achieved under the following conditions: HCl-activated Dowex®, reaction temperature 95 °C, carrier amount 3.0 µmol, cyclohexanol as solvent, and excess water evaporated. The Dowex® approach was compared to electrophilic reactions with Chloramine-T and Iodogen, and it was concluded that the Dowex® approach leads to superior radiochemical yields under the optimized conditions. The Dowex® method was successfully tested on the sugar analogues, resulting in a single main product at a satisfactory 50–56% radiochemical yield. The main products were successfully characterized with NMR, and the method was furthermore indicated to be regioselective. It is plausible that the developed method may be improved further in terms of radiochemical yield and molar activity, and that it could prove a useful tool for developing novel radioiodinated molecular tracers for plant studies.
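Radiochemical yields of short-lived isotopes are normally decay-corrected to the start of synthesis. A minimal sketch of that correction (not taken from the thesis; the half-life is whatever the chosen iodine isotope dictates, e.g. roughly 13.2 h for I-123):

```python
import math

def decay_corrected_yield(product_activity, start_activity,
                          elapsed_h, half_life_h):
    """Radiochemical yield corrected for decay during synthesis:
    the starting activity is first decayed to the measurement time,
    then the product activity is expressed as a fraction of it."""
    remaining = start_activity * math.exp(
        -math.log(2) * elapsed_h / half_life_h)
    return product_activity / remaining
```

Without this correction, long syntheses would understate the chemical efficiency of the labelling reaction.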
  • Kuosmanen, Teemu (2020)
    Cancer is a dynamic and complex microevolutionary process. All attempts at curing cancer thus rely on successfully controlling the evolving future cancer cell population as well. Since the emergence of drug resistance severely limits the success of many anti-cancer therapies, especially in the case of the promising targeted therapies, we urgently need better ways of controlling cancer evolution with our treatments to avoid resistance. This thesis characterizes acquired drug resistance as an evolutionary rescue and uses optimal control theory to critically investigate the rationale of aggressive maximum tolerated dose (MTD) therapies that represent the standard of care for first-line treatment. Unlike previous models of drug resistance, which mainly concentrate on minimizing tumor volume, here the optimal control problem is reformulated to explicitly minimize the probability of evolutionary rescue, or equivalently, to maximize the extinction probability of the cancer cells. Furthermore, I investigate the effects of drug-induced resistance, where the rate of gaining new resistant cells increases with the dose due to an increased genome-wide mutation rate and non-genetic adaptations (such as epigenetic regulation and phenotypic plasticity). This approach not only reflects biological realism but also allows modelling the cost of control in a quantifiable manner, instead of using an ambiguous and incomparable penalty parameter for the cost of treatment. The major finding presented in this thesis is that MTD-style therapies may actually increase the likelihood of an evolutionary rescue even when only modest drug-induced effects are present. This suggests that significant improvements to treatment outcomes may be accomplished, at least in some cases, by treatment optimization. The resistance-promoting properties of different anti-cancer therapies should therefore be properly investigated in experimental and clinical settings.
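The extinction probability being maximized has a classical closed form for the simplest case. As an illustrative sketch only (the thesis’s model is richer than a constant-rate birth-death process): for a linear birth-death process started from one cell with birth rate b and death rate d, extinction is certain if d ≥ b and otherwise occurs with probability d/b; independent cells multiply.

```python
def extinction_probability(birth, death, n_cells=1):
    """Extinction probability of a linear birth-death branching
    process: per founding cell it is 1 if death >= birth, else
    death/birth; n_cells independent lineages multiply."""
    per_cell = 1.0 if death >= birth else death / birth
    return per_cell ** n_cells
```

A dose that raises the death rate but also raises the influx of resistant lineages changes both factors at once, which is exactly the tension the optimal control formulation addresses.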
  • Pakkanen, Noora (2021)
    In Finland, the final disposal of spent nuclear fuel will start in the 2020s, with spent nuclear fuel disposed of 400-450 meters deep in the crystalline bedrock. Disposal will follow the Swedish KBS-3 principle, in which spent nuclear fuel canisters are protected by multiple barriers planned to prevent the migration of radionuclides to the surrounding biosphere. With multiple barriers, failure of one barrier will not endanger the isolation of the spent nuclear fuel. Insoluble spent nuclear fuel will be stored in iron-copper canisters and placed in vertical tunnels within the bedrock. The iron-copper canisters are surrounded by a bentonite buffer to protect them from groundwater and from movements of the bedrock. MX-80 bentonite has been proposed as the bentonite buffer in the Finnish spent nuclear fuel repository. In the case of canister failure, the bentonite buffer is expected to absorb and retain radionuclides originating from the spent nuclear fuel. If the salinity of Olkiluoto Island’s groundwater were to decrease, chemical erosion of the bentonite buffer could result in the generation of small particles called colloids. Under suitable conditions, these colloids could act as potential carriers for otherwise immobile radionuclides and transport them outside the facility area to the surrounding biosphere. The objective of this thesis was to study the effect of MX-80 bentonite colloids on radionuclide migration within two granitic drill core columns (VGN and KGG), using two different radionuclides, 134Cs and 85Sr. Batch-type sorption and desorption experiments were conducted to gain information on the sorption mechanisms of the two radionuclides as well as on the sorption competition between MX-80 bentonite colloids and crushed VGN rock. Colloids were characterized with scanning electron microscopy (SEM) and particle concentrations were determined with dynamic light scattering (DLS). 
Allard water mixed with MX-80 bentonite powder was used to imitate low-salinity groundwater conditions and colloids. Strontium’s breakthrough from the VGN drill core column was successful, whereas caesium did not break through from either the VGN or the KGG column. Caesium’s sorption was more irreversible in nature than strontium’s, and caesium was thus retained strongly within both columns. With both radionuclides, the presence of colloids did not seem to enhance radionuclide migration notably. Breakthrough from the columns was affected both by radionuclide properties and by colloid filtration within tubes, stagnant pools and fractures. The experiments could be further complemented by conducting batch-type sorption experiments with crushed KGG and by introducing new factors to the column experiments. The experimental work was carried out at the Radiochemistry unit of the Department of Chemistry, University of Helsinki.
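Batch sorption experiments like these are conventionally summarized by a distribution coefficient Kd. A minimal sketch of the standard calculation (the numbers in the test are illustrative, not the thesis’s data):

```python
def distribution_coefficient(c_init, c_final, volume_ml, mass_g):
    """Batch-sorption distribution coefficient Kd (ml/g): activity
    lost from solution per gram of solid, relative to the activity
    remaining in solution after equilibration."""
    return (c_init - c_final) / c_final * volume_ml / mass_g
```

A strongly and irreversibly sorbing nuclide such as caesium shows a much higher Kd than strontium, consistent with the breakthrough behaviour described above.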
  • Jussila, Joonas (2019)
    In this thesis, the sputtering of tungsten surfaces under ion irradiation is studied using molecular dynamics simulations. The focus of this work is on the effect of surface orientation and incoming angle on tungsten sputtering yields. We present a simulation approach for computing the sputtering yields of completely random surface orientations. This allows obtaining total sputtering yields averaged over a large number of arbitrary surface orientations, which are representative of the sputtering yield of a polycrystalline sample with random grain orientations in a statistically meaningful way. In addition, a completely different method was utilised to simulate the sputtering yields of tungsten fuzz surfaces with various fuzz structure heights. We observe that the total sputtering yields of the investigated surfaces clearly depend on the surface orientation, and that the sputtering yields of average random surfaces differ from the results of any of the low-index surfaces or their averages. The low-index surface and random surface sputtering yields also depend on the incoming angle of the projectile ions. In addition, we calculate the outgoing angular distribution of sputtered tungsten atoms in every bombardment case, which likewise proves sensitive to the surface orientation. Finally, the effect of fuzz height on the sputtering yield of tungsten fuzz surfaces is discussed. We see that tungsten fuzz significantly reduces the sputtering yield compared to a pristine tungsten surface, and that the effect is already visible when the fuzz pillar height is only a few atomic layers.
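The quantity being averaged is simple: a sputtering yield is sputtered atoms per incident ion, and the polycrystalline estimate is its mean over many random orientations. A minimal sketch (illustrative; the counts below are not simulation results):

```python
def sputtering_yield(sputtered_counts, ions_per_run):
    """Per-run sputtering yields (atoms removed / incident ions)
    and their average over many randomly oriented surface runs."""
    yields = [n / ions_per_run for n in sputtered_counts]
    return yields, sum(yields) / len(yields)
```

Because yields vary strongly with orientation, the spread of the per-run values matters as much as the mean when comparing to low-index surfaces.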
  • Talonpoika, Ville (2020)
    In recent years, virtual reality devices have entered the mainstream with many gaming-oriented consumer devices. However, the locomotion methods utilized in virtual reality games have yet to settle into a standardized form, and different types of games have different requirements for locomotion to optimize player experience. In this thesis, we compare some popular and some uncommon locomotion methods in different game scenarios. We consider their strengths and weaknesses in these scenarios from a game design perspective. We also make suggestions on which kinds of locomotion methods would be optimal for different game types. We conducted an experiment with ten participants, seven locomotion methods and five virtual environments to gauge how the locomotion methods compare against each other, utilizing game scenarios requiring timing and precision. Our experiment, while small in scope, produced results we could use to construct useful guidelines for selecting locomotion methods for a virtual reality game. We found that the arm swinger was a favourite for situations where precision and timing were required. Touchpad locomotion was also considered one of the best for its intuitiveness and ease of use. Teleportation is a safe choice for games not requiring a strong feeling of presence.
  • Orozco Ramírez, Lilia Estefanía (2019)
    The European Water Framework Directive aims at restoring all water bodies to good ecological condition by the year 2023. To this end, understanding the responses of these ecosystems to current and future pressures is a requisite. Lakes Hältingträsk and Storträsk are located in Östersundom, a developing suburban area in eastern Helsinki. Alterations to the catchment of Hältingträsk as a consequence of urbanization will likely change the conditions of the lake. Storträsk, part of the Sipoonkorpi nature reserve, is primarily influenced by recreational activities. The ecological status of both lakes is likely to change under the ongoing urban development. For this reason, the reference conditions of Hältingträsk and the resilience of both lakes to human stressors must be assessed. A long-term record from Hältingträsk, with special focus on the most recent section, as well as a short core from Storträsk targeting the most recent events, are analyzed for different palaeobiological and geochemical proxies. The sequence from Hältingträsk is evaluated with diatom assemblages, trace metal analyses, lithological description of sediments through loss-on-ignition, and inferred chlorophyll a. For Storträsk, a high-resolution study of diatom communities and photosynthetic pigments is performed. Both sequences are framed with an age-depth model based on radiometric dating techniques. In addition, the results are analyzed with statistical tools, and fossil diatom data is used to reconstruct lake-water pH. The results describe the evolution of Hältingträsk from the mid-Holocene until recent times; the diatom assemblages indicate that the area was part of the Ancylus Lake and, later, of the Litorina Sea, and that it was isolated from the Baltic basin at 6500 cal BP. This is supported by the high concentrations of Fe and Mn, showing the presence of metallic nodules common in marine environments. 
The change in sediments and the predominance of fragilarioid diatoms display the succession of the lake (from flada to gloe). Afterwards, the ontogeny of the lake and the development of the surrounding peat bog can be tracked through changes in the diatom community and a decrease in heavy metal concentrations. The reconstructed pH reveals that Hältingträsk is a naturally acidic lake. Furthermore, signals of agricultural activities and industrialization in the area, as well as their development, are recorded through shifts in the diatom community and the oscillation of trace metals of both local (Cu, Ni and V) and long-range (Pb, Zn and Cd) transport. Finally, climatic anomalies such as the Little Ice Age and current climate warming are imprinted in the diatom assemblages and the photosynthetic pigments. The high-resolution subsampling from Storträsk displayed little variation. The faint changes could be attributed to CaCO3 treatment, fish introduction or recent climate warming. However, discerning the influence of each of these stressors was not possible.
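Diatom-based pH reconstructions commonly use a weighted-averaging transfer function. A minimal sketch of the idea (the calibration set and taxon optima of the thesis are not reproduced here; the values in the test are illustrative):

```python
def wa_reconstruct_ph(abundances, optima):
    """Weighted-averaging transfer function: the reconstructed
    lake-water pH is the abundance-weighted mean of the pH optima
    of the taxa present in a fossil sample."""
    total = sum(abundances)
    return sum(a * o for a, o in zip(abundances, optima)) / total
```

Each fossil level in the core yields one reconstructed pH value, and the downcore series is what reveals the naturally acidic baseline.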
  • Spönla, Elisa (2020)
    The aim of this thesis was to study enzymatic treatment as a way to modify paper-grade pulp into a suitable raw material for the future textile industry. Wood as a raw material is an environmentally friendly option for textile production, but its sustainable exploitation is not easy. Currently, ionic liquids are expected to enable a safe and sustainable process for the production of wood-based regenerated fibres. These processes commonly use dissolving pulp as their raw material, but replacing dissolving pulp with paper-grade kraft pulp would decrease environmental impact and production expenses. In this work, the molar mass distribution of softwood paper-grade kraft pulp was selectively modified using enzymes. Enzymes were utilized instead of acids because of their favourable ability to selectively modify targeted polymers and to increase fibre porosity. Enzymatic modifications of softwood kraft pulp were performed to decrease the degree of polymerization of cellulose and to lower the quantity of hemicellulose. Hydrolysis of cellulose was catalysed with endo-1,4-β-glucanase (endoglucanase), and hemicellulose was degraded using endo-1,4-β-mannanase and endo-1,4-β-xylanase. The treatments were carried out both at high (20%) and low (3%) pulp consistency to examine the synergistic effect of enzymatic and mechanical action arising in the high-consistency treatment. Additionally, the influence of different enzyme combinations on the pulp properties was studied. The modified pulp samples were characterized by determining intrinsic viscosity, molar mass distribution, yield loss, and its composition. The fibres were imaged with light microscopy. The degree of polymerization of the pulp cellulose was successfully decreased with a relatively small endoglucanase dose. The amount of hemicellulose was reduced by removing 11% of the total galactoglucomannan and 40% of the total arabinoglucuronoxylan. 
The high-consistency treatments decreased intrinsic viscosity 1.9 times more on average than the low-consistency treatments. The high-consistency treatments were effective with low enzyme doses, easy to control, and reliably repeatable. Therefore, enzymatic pulp treatment at high consistency seems to be a suitable way to modify paper-grade kraft pulp into a raw material for textile production. Further studies on pulp dissolution in ionic liquids, fibre spinning, and fibre regeneration should be conducted to confirm the applicability of the modified fibres.
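Intrinsic viscosity is used here as a proxy for the degree of polymerization, via a Mark-Houwink-type relation [η] = K · DPᵃ. A sketch of the inversion only; the constants K and a must be taken from the literature for the solvent system used and are purely hypothetical in the test below:

```python
def degree_of_polymerisation(intrinsic_viscosity, K, a):
    """Invert the Mark-Houwink relation [eta] = K * DP**a to
    obtain an average degree of polymerisation from a measured
    intrinsic viscosity (K, a are solvent-system constants)."""
    return (intrinsic_viscosity / K) ** (1.0 / a)
```

Because a < 1 for cellulose in the usual solvents, a given relative drop in viscosity corresponds to an even larger relative drop in DP.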
  • Mäki-Iso, Emma (2021)
    The magnitude of the market risk of investments is often examined by means of risk measures. A risk measure is a mapping from the set of random variables describing possible losses to the real numbers. Risk measures make it easy to compare the riskiness of different investments, and banking supervisors use them in supervising the capital adequacy of banks. The risk measure longest in general use is Value-at-Risk (VaR). VaR gives the largest loss incurred at a chosen confidence level α, i.e. it is the α-quantile of the loss distribution. In the latest Basel guidance (Minimum capital requirements for market risk), a risk measure called expected shortfall replaces VaR in the calculation of the capital requirement. Expected shortfall gives the expected value of the loss when the loss exceeds the level given by VaR. The risk measure is being changed because the theoretical properties of VaR are not as good as those of expected shortfall: VaR is not subadditive, which means that the combined risk of several positions can in some cases be greater than the sum of the risks of the individual positions. As a consequence, the risk of an undiversified portfolio can appear smaller than that of a diversified one. Expected shortfall is not entirely without problems either, because it is not consistently scoring: no scoring function exists with which estimated and realized values could be compared consistently. In addition, because the magnitude of expected shortfall depends on all the losses in the tail, it is sensitive to errors in the tail losses. This is not a desirable property, because estimating the tails of loss distributions involves considerable uncertainty. Because risk estimation involves uncertainty, regulation obliges banks to backtest the risk estimates used in calculating the regulatory capital requirement. Backtesting refers to the process in which estimated risk figures are compared with realized losses. Backtesting of VaR estimates is based on the number of days in the test period on which the loss exceeds the level given by the VaR estimate. For expected shortfall, backtesting methods are not yet as well established as for VaR estimates. This thesis presents three different ways to backtest expected shortfall estimates, introduced by Kratz and colleagues, by Moldenhauer and Pitera, and by Costanzino and Curran. The methods examine, respectively, simultaneous exceedances of several VaR levels; the number of observations that cumulate into a positive value of the secured position, i.e. the difference between the loss and the risk estimate; and the average size of the VaR exceedances. The computational part of the thesis investigated whether VaR and expected shortfall backtests give similar results and whether the length of the observation period used in risk estimation affects how the estimates perform in the backtests. The calculations showed that the expected shortfall and VaR backtests gave similar results. Estimates computed from market data with estimation windows of different sizes obtained test statistics of different magnitudes in the backtests, and accepted a wrong model or rejected the correct model with different probabilities. When purely simulated data was used, there were no differences between the results of estimates computed with estimation windows of different sizes. It can thus be concluded that the differences in test results between estimates computed over observation periods of different lengths are due not only to the number of observations but also to their quality.
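The two risk measures can be illustrated on a loss sample. A minimal empirical sketch (not from the thesis; the simple order-statistic quantile used here is one of several conventions):

```python
def var_and_es(losses, alpha):
    """Empirical Value-at-Risk and expected shortfall from a sample
    of losses (positive values = losses): VaR is the empirical
    alpha-quantile, ES the mean of losses at or above it."""
    ordered = sorted(losses)
    idx = min(int(alpha * len(ordered)), len(ordered) - 1)
    var = ordered[idx]
    tail = [x for x in ordered if x >= var]
    return var, sum(tail) / len(tail)
```

By construction ES ≥ VaR, and ES responds to every loss in the tail, which is exactly the sensitivity-to-tail-errors issue discussed above.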
  • Gonzalez Ateca, Marcos (2020)
    The distribution of matter in space is not homogeneous. Large structures such as galaxy groups, clusters and big empty regions called voids can be observed at large scales in the Universe. The large-scale structure of the Universe depends both on the cosmological parameters and on the dynamics of galaxy formation and evolution. One of the main observables that allow us to quantify this structure is the two-point correlation function, with which we can trace different galaxy properties, such as luminosity and stellar mass, and track their evolution with redshift. In galaxy surveys, we do not obtain the location of galaxies in real space; we obtain our data in what is called redshift space. Redshift space can be defined as a distortion of real space generated by the redshift introduced by the peculiar velocities of galaxies and by the Hubble expansion of the Universe. Therefore, the distribution of galaxies in redshift space looks different from the one obtained in real space. These differences between the two spaces are small but not negligible, and they depend strictly on the cosmology. In this work, we assume a ΛCDM cosmology. To compute the different one-dimensional and two-dimensional correlation functions, we use the most recent version of the code provided by the Euclid Consortium, which belongs officially to the ESA Euclid mission. Moreover, we also need galaxy catalogues. These catalogues, called Minerva mocks, are a set of 300 different cosmological mocks produced with N-body simulations. Finally, as there is a well-defined relation between real and redshift space, one can also assume that there is a relation between the two-point correlation functions in real and redshift space. 
In this project, we prove that the real-space one-dimensional two-point correlation function, which is the physically meaningful one, can be derived from the two-dimensional two-point correlation function in redshift space following a geometrical procedure that is independent of approximations. This method, in theory, should work on all distance scales.
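Correlation-function estimators are built from pair counts binned by separation. A minimal one-dimensional sketch of that ingredient (illustrative only; production codes such as the Euclid Consortium pipeline use optimized 3D pair counting, and estimators such as Landy-Szalay combine data-data, data-random and random-random counts):

```python
def pair_count_hist(points, edges):
    """Count unordered point pairs by separation into bins given
    by `edges` -- the DD/DR/RR ingredient of two-point correlation
    function estimators."""
    counts = [0] * (len(edges) - 1)
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            d = abs(points[i] - points[j])
            for k in range(len(edges) - 1):
                if edges[k] <= d < edges[k + 1]:
                    counts[k] += 1
                    break
    return counts
```

Normalizing the data-pair histogram by the same histogram for a random catalogue gives the excess clustering probability that defines the correlation function.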
  • Kokkonen, Maija (2019)
    The European Union is a notable political actor that strives to govern and produce EU territory through spatial policies and planning. So far, spatial planning has been a technology for governing the terrestrial environment, but marine space is now seen as the new frontier of spatial planning. In 2014, the EU issued a directive on maritime spatial planning (MSP), which requires every coastal member state to establish spatial planning practices for its national marine areas by 2021 in line with the EU’s spatial agendas. MSP has been regarded as a managerial tool helping to enhance the ecological condition of the seas, but not as a policy that produces spatiality. In this research, the EU’s MSP policy is used to study the Europeanization of space in ‘EU’rope. The aim of this research is to interpret how the understanding of ‘EU’rope as a territorial entity is shaped through the structure of the maritime spatial planning policy and the meanings attached to it, in order to form a perception of the future development of the EU and marine areas in general. The research is conducted from a social constructionist approach as an interpretive policy analysis. The concept of policy integration is built into MSP and is used as an indicator of Europeanization in this study. The policy integration effort is seen to steer the social networks of actors that create MSP in practice. Therefore, semi-structured theme interviews were conducted with the actors carrying out the MSP process in Finland. These actors’ understanding of Finnish MSP is seen to construct ‘EU’ropean space in and through the domestic MSP process. In accordance with hermeneutic traditions, comprehensive contextualization is carried out in this research in order to understand the maritime spatial planning policy. The research suggests that the spatiality and territoriality of marine areas produce different kinds of planning practices than are seen in the terrestrial environment. 
The EU’s MSP policy is a policy tool for the EU territory, but at the same time it is used as a tool to carry out domestic regional objectives as well. In Finland, the coastal Regions have benefitted from MSP and gained more power over the Finnish marine territories, and MSP may be used to reinforce Regional planning. By adopting the MSP policy, the EU has changed the spatial governance structure of marine Europe. It has transformed the heterogeneous marine areas of Europe into a single entity in order to make them spatially governable by the EU. These spaces have therefore been subjected to larger decision-making processes than before, and the EU is able to harness the national marine territories for the benefit of the whole of Europe, mainly with a view to increasing economic growth in the territory. By means of policy integration efforts, MSP creates new kinds of socio-spatial dimensions in Europe, in which political bargaining over domestic marine spaces becomes a norm for domestic maritime spatial planners. The research suggests that the territorial policy integration efforts reinforce the objectives of the EU directive in transnational collaboration, and this new platform of negotiation can be expected to unify neighbouring countries’ domestic planning practices and goals to some extent.
  • Karis, Peter (2020)
    This thesis presents a user study evaluating the usability and effectiveness of a novel search interface compared to a more traditional solution. InnovationMap is a novel search interface by Khalil Klouche, Tuukka Ruotsalo and Giulio Jacucci (University of Helsinki). It is a tool for helping the user perform ‘exploratory search’: a type of search activity where the user is exploring an information space unknown to them and thus cannot form a specific search phrase to perform a traditional ‘lookup’ search as with conventional search interfaces. In this user study, InnovationMap is compared against TUHAT, a search portal currently in use at the University of Helsinki for finding research works and research personnel in the university databases. The user evaluation is conducted as a qualitative within-subject study with volunteer users from the University of Helsinki. Each participant uses both systems in alternating order over the course of two sessions. During the two sessions, the volunteer user carries out the information-finding tasks defined in the experiment design, answers a SUS (System Usability Scale) questionnaire and participates in a semi-structured interview. The answers to the assigned tasks are then evaluated and scored by field experts. The combined results from these methods are used to formulate an educated assessment of the usability, effectiveness and future development potential of the InnovationMap search system.
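The SUS questionnaire has a fixed, well-known scoring rule, which can be sketched directly (the responses in the test are illustrative, not study data):

```python
def sus_score(responses):
    """System Usability Scale score (0-100) from ten 1-5 Likert
    responses: odd-numbered items contribute (r - 1), even-numbered
    items (5 - r), and the sum is scaled by 2.5."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5
```

The alternating positive/negative item wording is why odd and even items are scored in opposite directions.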
  • Aula, Kasimir (2019)
    Air pollution is considered one of the biggest environmental risks to health, causing symptoms ranging from headache to lung diseases, cardiovascular diseases and cancer. To improve awareness of pollutants, air quality needs to be measured more densely. Low-cost air quality sensors offer one solution for increasing the number of air quality monitors. However, they suffer from low measurement accuracy compared to professional-grade monitoring stations. This thesis applies machine learning techniques to calibrate the values of a low-cost air quality sensor against a reference monitoring station. The calibrated values are then compared to the reference station’s values to compute the error after calibration. In the past, the evaluation phase has been carried out very lightly. A novel method of selecting data is presented in this thesis to ensure diverse conditions in the training and evaluation data, yielding a more realistic impression of the capabilities of a calibration model. To better understand the level of performance, selected calibration models were trained with data corresponding to different levels of air pollution and different meteorological conditions. Regarding pollution level, when using homogeneous training and evaluation data, the error of a calibration model was found to be up to 85% lower than when using a diverse training and evaluation pollution environment. Also, using diverse meteorological training data instead of more homogeneous data was shown to reduce the error and stabilize the behavior of the calibration models.
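    A minimal sketch of the kind of calibration the abstract describes, assuming a simple linear model fitted with ordinary least squares; the sensor and reference readings below are invented for illustration, not data from the thesis:

```python
import numpy as np

# Hypothetical example: co-located low-cost sensor readings and
# reference station values (e.g. PM2.5 in µg/m³). Real training data
# would need to span diverse pollution and meteorological conditions,
# as the thesis emphasizes.
sensor = np.array([12.0, 25.0, 40.0, 55.0, 80.0])
reference = np.array([10.0, 21.0, 35.0, 47.0, 70.0])

# Fit a linear calibration model: reference ≈ a * sensor + b.
A = np.vstack([sensor, np.ones_like(sensor)]).T
(a, b), *_ = np.linalg.lstsq(A, reference, rcond=None)

# Apply the calibration and compute the remaining error (RMSE).
calibrated = a * sensor + b
rmse = np.sqrt(np.mean((calibrated - reference) ** 2))
print(f"slope={a:.3f}, intercept={b:.3f}, RMSE={rmse:.2f}")
```

    In practice the thesis compares more elaborate machine learning models, but the principle is the same: learn a mapping from sensor output to reference values on one data split, then report the error on held-out data.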
  • Dok, Matilda Carol (2020)
    This thesis explores everyday spatial practices in the gentrified and micro-segregated Eastleigh, Nairobi. Gentrification is one of the most important topics in urban studies and social geography, contributing to significant socioeconomic changes in many metropolitan cities around the world. Although emerging empirical studies indicate socioeconomic impacts of gentrification, less research has been conducted to examine social and economic interaction in gentrified spaces in the Global South. Additionally, there are limited studies on how cultural diversity influences gentrification. In the case of a diversified neighbourhood such as Eastleigh, assessing the effects of culture on gentrification is significant. Therefore, this study aimed to determine, by observing and interviewing residents, whether the developments in Eastleigh can be analysed and interpreted through the theoretical framework of gentrification and micro-segregation. The study used a descriptive research design, building on literature and graphics to collect data on gentrification indicators and socioeconomic interactions. The qualitative part of the study entailed observation, a questionnaire survey and key informant interviews, while the quantitative analysis was based on graphical presentation of the data. The outcomes of the study strongly suggest that an increase in housing variables, the influx of a wealthy population, increased employment and a shift in consumption trends are the significant indicators of ongoing gentrification in Eastleigh. The empirical studies indicate that social interactions in gentrified spaces appear to be marginalized due to cultural differences that have a strong impact on social and economic agents. The review made similar observations regarding social interactions between newcomers and long-time residents.
The results of the study also found that the reason for social and economic inequalities between the residents and the gentrifiers was cultural differences, which hindered access to social and economic services. However, since this study is one of the initial studies on gentrification in Eastleigh, Nairobi, further in-depth studies are recommended.
  • Vuoksenmaa, Aleksis Ilari (2020)
    Coagulation equations are evolution equations that model the time-evolution of the size-distribution of particles in systems where colliding particles stick together, or coalesce, to form one larger particle. These equations arise in many areas of science, most prominently in aerosol physics and the study of polymers. In the former case, the colliding particles are small aerosol particles that form ever larger aerosol particles, and in the latter case, the particles are polymers of various sizes. As the system evolves, the density of particles of a specified size changes. The rate of change is specified by two competing factors. On one hand there is a positive contribution coming from smaller particles coalescing to form particles of this specific size. On the other hand, particles of this size can coalesce with other particles to form larger particles, which contributes negatively to the density of particles of this size. Furthermore, if there is no addition of new particles into the system, then the total mass of the particles should remain constant. From these considerations, it follows that the time-evolution of the coagulation equation is specified for every particle size by a difference of two terms which preserve the total mass of the system. The physical properties of the system affect the time evolution via a coagulation kernel, which determines the rate at which particles of different sizes coalesce. A variation of coagulation equations is achieved when we add an injection term to the evolution equation to account for new particles injected into the system. This results in a new evolution equation, a coagulation equation with injection, where the total mass of the system is no longer preserved, as new particles are added into the system at each point in time. Coagulation equations with injection may have non-trivial solutions that are independent of time. 
The existence of non-trivial stationary solutions has ramifications in aerosol physics, since these may correspond to observations that the particle size distribution in the air stays approximately constant. In this thesis, it will be demonstrated, following Ferreira et al. (2019), that for any sufficiently regular injection term and for suitably chosen, compactly supported coagulation kernels, there exists a stationary solution to a regularized version of the coagulation equation. This theorem, which relies heavily on functional analytic tools, is a central step in the proof that certain asymptotically well-behaved kernels admit stationary solutions for any prescribed compactly supported injection term.
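The verbal description above corresponds to the standard continuous coagulation equation with injection; the notation here is the conventional one and is not necessarily that used in the thesis:

```latex
% f(x,t): density of particles of size x at time t
% K(x,y): coagulation kernel, \eta(x): injection term
\partial_t f(x,t)
  = \frac{1}{2}\int_0^x K(y,\,x-y)\, f(y,t)\, f(x-y,t)\,\mathrm{d}y
  - f(x,t)\int_0^\infty K(x,y)\, f(y,t)\,\mathrm{d}y
  + \eta(x)
```

The first integral is the gain from pairs of smaller particles coalescing into size x (the factor 1/2 avoids double counting), the second term is the loss of size-x particles coalescing with others, and a stationary solution is a time-independent f for which the right-hand side vanishes.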
  • Kangas, Vilma (2020)
    Software testing is an important process for ensuring a program's quality. However, testing has not traditionally been a very substantial part of computer science education. Some attempts to integrate it into the curriculum have been made, but best practices remain an open question. This thesis discusses multiple attempts at teaching software testing over the years. It also introduces CrowdSorcerer, a system for gathering programming assignments with tests from students. It has been used in introductory programming courses at the University of Helsinki. To study whether students benefit from creating assignments with CrowdSorcerer, we analysed the number of assignments and tests they created and whether these correlate with their performance on a testing-related question in the course exam. We also gathered feedback from the students on their experiences of using CrowdSorcerer. Looking at the results, it seems that more research on how to teach testing would be beneficial. Improving CrowdSorcerer would also be worthwhile.
  • Kilpeläinen, Wille Julius (2020)
    Inductively coupled plasma mass spectrometry (ICP-MS) is a state-of-the-art technique for elemental analysis. The technique allows fast and simultaneous analysis of multiple elements with a wide dynamic range and low detection limits. However, the multiple adjustable parameters and the complex nature of ICP-MS instruments can make the development of new analysis methods a tedious process. Design of experiments (DOE), or experimental design, is a statistical approach for conducting multivariate experiments in a way that yields the maximal amount of information from each experiment. By using DOE, the number of experiments needed for analytical method optimization can be minimized, and information about the interrelations of different experimental variables can be obtained. The aim of this thesis is to address the utilization of DOE for ICP-MS method development as a more efficient means to optimize analytical methods. The first part of this two-part thesis gives an overview of the basics of ICP-MS and DOE. Then a literature review on applying experimental design to ICP-MS method optimization is given, and the current state of the research is discussed. In the second part, two new ICP-MS methods for simultaneous determination of 28 elements from six middle distillate fuels, diluted with xylene or kerosine, are presented. The method development involved optimization of the integration times and optimization of test sample dilution ratios and viscosities using univariate techniques. In addition, experimental designs were successfully utilized together with a desirability approach in multivariate optimizations of the plasma conditions and sample matrix compositions to achieve the best possible analyte recoveries from various matrices.
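    To illustrate the kind of design DOE builds on, here is a minimal sketch of a two-level full factorial design; the factor names and levels are hypothetical examples of ICP-MS plasma parameters, not values taken from the thesis:

```python
from itertools import product

# Hypothetical factors: three ICP-MS plasma parameters, each at a
# low and a high level (values are illustrative only).
factors = {
    "rf_power_W": [1200, 1550],
    "nebulizer_gas_L_min": [0.9, 1.1],
    "sample_depth_mm": [7, 10],
}

# A 2^3 full factorial enumerates every combination of the low/high
# levels, so all main effects and two-factor interactions can be
# estimated from only eight runs.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for run in design:
    print(run)
print(f"{len(design)} runs")
```

    Fractional factorial or response-surface designs reduce the run count further when many parameters are screened at once, which is what makes DOE attractive for instruments with as many adjustable settings as an ICP-MS.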