
Browsing by Title


  • Kotipalo, Leo (2023)
    Simulating space plasma on a global scale is computationally demanding due to the system size involved. Modeling regions with variable resolution depending on physical behavior can save computational resources without compromising too much on simulation accuracy. This thesis examines adaptive mesh refinement as a method of optimizing Vlasiator, a global hybrid-Vlasov plasma simulation. Behavior of plasma near the Earth's magnetosphere and different characteristic scales that need to be considered in simulation are introduced. Kinetic models using statistical methods and fluid methods are examined. Modeling electrons kinetically requires resolutions orders of magnitude finer than ions, so in Vlasiator ions are modeled kinetically and electrons as a fluid. This allows for lighter simulation while preserving some kinetic effects. Mesh refinement used in Vlasiator is introduced as a method to save memory and computational work. Due to the structure of the magnetosphere, resolution isn't uniform in the simulation domain, with particularly the tail regions and magnetopause having rapid spatial changes compared to the relatively uniform solar wind. The region to refine is parametrized and static throughout a simulation run. Adaptive mesh refinement based on the simulation data is introduced as an evolution of this method. This provides several benefits: more rigorous optimization of refinement regions, easier reparametrization for different conditions, following dynamic structures and saving computation time in initialization. Refinement is done based on two indices measuring the spatial rate of change of relevant variables and reconnection respectively. The grid is re-refined at set intervals as the simulation runs. Tests similar to production runs show adaptive refinement to be an efficient replacement for static refinement. Refinement parameters produce results similar to the static method, while giving somewhat different refinement regions. 
Performance is in line with static refinement, and refinement overhead is minor. Further avenues of development are presented, including dynamic refinement intervals.
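The refinement criterion described above — flag regions where the spatial rate of change of a relevant variable is large, and re-refine at set intervals — can be sketched in a few lines. This is an illustrative 1D gradient-based index on synthetic data, not Vlasiator's actual refinement index; the threshold `alpha` and the density profile are assumptions.

```python
import numpy as np

def refinement_flags(values, dx, alpha=0.1):
    """Flag cells whose normalised spatial rate of change exceeds alpha.

    A stand-in for a gradient-based refinement index: the gradient of a
    relevant variable (e.g. plasma density), scaled by the local magnitude
    so the index is dimensionless.
    """
    grad = np.abs(np.gradient(values, dx))
    scale = np.maximum(np.abs(values), 1e-30)  # avoid division by zero
    index = grad * dx / scale                  # relative change per cell
    return index > alpha

# The grid would be re-refined at set intervals as the simulation runs;
# here we evaluate the index once on a sharp synthetic density front.
x = np.linspace(0.0, 1.0, 200)
density = 1.0 + 5.0 * np.exp(-((x - 0.5) / 0.02) ** 2)
flags = refinement_flags(density, dx=x[1] - x[0], alpha=0.1)
print(f"{flags.sum()} of {flags.size} cells flagged for refinement")
```

Only the cells around the steep front are flagged, so most of the domain stays coarse — the memory saving the abstract describes.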
  • Jalli, Heini (2020)
    Wave measurement in the Baltic Sea has conventionally been carried out with a wave buoy, a moored surface buoy. This observation method shortens the measurement season because of the ice winter. To extend the measurement season, a measurement system is needed that does not have to be lifted out of the water before the ice winter. Surface currents are commonly measured with bottom-mounted acoustic Doppler current profilers (ADCP), which lack real-time data transfer. In recent years, sensors enabling surface-current observation have been added to wave buoys that transmit near-real-time data. This thesis assesses the reliability of ADCP wave measurements compared to a wave buoy, and compares the surface-current observations of the two instruments. The observational data used in this thesis come from instrument intercomparisons carried out during two summers, 2017 and 2018. The measurement campaigns took place in the Gulf of Finland, east of the Hanko peninsula. The data were quality-controlled before comparison; the quality-control criteria were derived from the limit values and recommendations given by the instrument manufacturers and from limit values for Gulf of Finland wave conditions reported in the literature. The data were analysed and compared using time series, scatter plots and statistical measures. The significant wave heights from the ADCP and the wave buoy agree well, but the ADCP cannot reliably detect waves below 0.5 metres. The ADCP deployed at greater depth underestimates significant wave height rather systematically compared to the wave buoy observations. In theory this underestimation could be corrected, improving the agreement between the measurements, but in practice this is not sensible, because it would require new intercomparisons for, among other things, every instrument and deployment site. The agreement of peak period and wave direction was not statistically significant; for the ADCP measurements of these quantities to be usable, a more detailed spectral analysis would be needed. The surface-current observations of the wave buoy and the ADCP agree to some extent, but as the waves grow, the differences between the measurements increase. The observed differences cannot be explained solely by the difference in measurement depth between the instruments, which was on average 1 metre.
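The kind of instrument intercomparison described here typically reduces to a few summary statistics over co-located time series. A minimal sketch with synthetic data, assuming the reported behaviour (a slight systematic underestimation by the ADCP); the numbers are illustrative, not the thesis's results.

```python
import numpy as np

def comparison_stats(reference, candidate):
    """Bias, RMSE and Pearson correlation between two co-located series,
    as used when comparing ADCP wave estimates against a wave buoy."""
    diff = candidate - reference
    bias = diff.mean()
    rmse = np.sqrt((diff ** 2).mean())
    corr = np.corrcoef(reference, candidate)[0, 1]
    return bias, rmse, corr

# Synthetic significant-wave-height series: the "ADCP" slightly and
# systematically underestimates the "buoy".
rng = np.random.default_rng(0)
buoy = rng.uniform(0.5, 2.5, 500)
adcp = 0.9 * buoy + rng.normal(0.0, 0.05, 500)
bias, rmse, corr = comparison_stats(buoy, adcp)
print(f"bias={bias:.2f} m, rmse={rmse:.2f} m, r={corr:.2f}")
```

A negative bias with high correlation is exactly the signature of a systematic underestimation that could, in principle, be corrected with a site- and instrument-specific calibration.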
  • Kanarik, Hedi (2018)
    The Finnish Meteorological Institute has a large number of project-specific current measurements made with acoustic Doppler current profilers (ADCP). Such acoustic instruments can cover large sea areas better than other current meters, which makes them one of the most popular methods worldwide for monitoring ocean currents. The most important condition for successful measurement is horizontal homogeneity of the measured current, which is not always met, for example because the scatterers of the acoustic signal in the water move independently. The instrument continuously checks that conditions are adequate and effectively removes, for example, the movements of fish that enter the measurement volume. If the instrument's internal quality control is too strict, however, it may too easily reject more exceptional phenomena, so stricter quality control is often left to be done separately. In this thesis I developed quality-control software for a bottom-mounted ADCP. The work focuses in particular on the Teledyne RD Instruments Workhorse Sentinel used by the Finnish Meteorological Institute. The threshold values for data quality are defined specifically for this manufacturer's profilers, and the tests are based on the information about the measurement process recorded by the instrument. The approach rests on the assumption that if the majority of the pings used to determine a current velocity were not sufficiently reliable, the remaining, seemingly successful measurements probably do not represent the true current either. The quality-control software was developed using as example material measurements carried out at the Lövskär intersection in the Archipelago Sea in 2013. The Lövskär dataset was of very high quality, and about 0.3% of the measurements were removed due to inhomogeneity. The uppermost 5-metre layer of the sea could not be measured because of strong sidelobe interference (13% of the measurements). The dataset shows a clear increase in measurement uncertainty in the thermocline and at night, caused by the activity of the zooplankton that act as scatterers. In general, the currents in the area were strongly stratified as a result of the thermocline, and short-lived strong currents (nearly 50 cm/s) occurred in the area in autumn.
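The core assumption of the quality control — if most pings behind a velocity estimate failed, reject the estimate — can be expressed as a simple mask over the instrument's percent-good diagnostic. The 75 % threshold below is illustrative, not the Workhorse Sentinel's actual limit.

```python
import numpy as np

def qc_mask(percent_good, threshold=75.0):
    """Accept velocity bins where at least `threshold` percent of the pings
    in an ensemble passed the instrument's internal screening.

    Implements the assumption that if the majority of pings were
    unreliable, the seemingly successful remainder probably does not
    represent the true current either.
    """
    return percent_good >= threshold

pg = np.array([[98.0, 96.0, 40.0],   # one ensemble per row,
               [99.0, 55.0, 97.0]])  # one depth bin per column
mask = qc_mask(pg)
print(mask)
```

Real post-processing would combine several such tests (correlation magnitude, error velocity, sidelobe cutoff near the surface) before accepting a bin.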
  • Pohjonen, Joona (2020)
    Prediction of the pathological T-stage (pT) in men undergoing radical prostatectomy (RP) is crucial for disease management as curative treatment is most likely when prostate cancer (PCa) is organ-confined (OC). Although multiparametric magnetic resonance imaging (MRI) has been shown to predict pT findings and the risk of biochemical recurrence (BCR), none of the currently used nomograms allow the inclusion of MRI variables. This study aims to assess the possible added benefit of MRI when compared to the Memorial Sloan Kettering, Partin table and CAPRA nomograms and a model built from available preoperative clinical variables. Logistic regression is used to assess the added benefit of MRI in the prediction of non-OC disease and Kaplan-Meier survival curves and Cox proportional hazards in the prediction of BCR. For the prediction of non-OC disease, all models with the MRI variables had significantly higher discrimination and net benefit than the models without the MRI variables. For the prediction of BCR, MRI prediction of non-OC disease separated the high-risk group of all nomograms into two groups with significantly different survival curves but in the Cox proportional hazards models the variable was not significantly associated with BCR. Based on the results, it can be concluded that MRI does offer added value to predicting non-OC disease and BCR, although the results for BCR are not as clear as for non-OC disease.
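The "added benefit" of an extra predictor is often quantified as an increase in discrimination (AUC). A minimal sketch on synthetic data, assuming a hypothetical clinical score and a hypothetical MRI variable that carries real signal; AUC is computed with the rank-sum identity rather than any particular library.

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    order = np.argsort(scores)
    ranks = np.empty_like(order, dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels.astype(bool)
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(1)
n = 4000
clinical = rng.normal(0, 1, n)   # hypothetical preoperative score
mri = rng.normal(0, 1, n)        # hypothetical MRI variable with signal
logit = 0.8 * clinical + 1.2 * mri
y = rng.random(n) < 1 / (1 + np.exp(-logit))  # simulated non-OC disease

auc_clin = auc(clinical, y)
auc_full = auc(0.8 * clinical + 1.2 * mri, y)
print(f"AUC clinical only: {auc_clin:.2f}, clinical + MRI: {auc_full:.2f}")
```

When the extra variable genuinely predicts the outcome, the combined score has higher discrimination, mirroring the comparison the study makes between nomograms with and without MRI.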
  • Välinen, Lauri (2023)
    Emulsion polymerization is used to make high-molecular-weight polymers at a fast reaction rate. In emulsion, the temperature is well controlled and the viscosity of the continuous phase remains constant, since all polymer chains are inside colloidal particles. Colloid dispersions have the advantage that they can be used as they are, without further purification, which is valuable for industrial purposes. Emulsion polymerization also scales well to meet the standards of industry. Adhesives serve an important role in the furniture and construction industries. Many adhesives used for such purposes are derived from non-renewable resources and are not reusable. Additionally, when such strong adhesives are used to attach wooden parts, the parts cannot be separated, and once the lifetime of the product is over, it ends up in a landfill. The ability to remove such strong adhesives from a wooden product would allow the wood to be used in other applications. The ability to reapply the adhesive would likewise decrease the amount of adhesive that needs to be produced and increase the lifetime of the glue product. In this thesis, polyvinyl acetate (PVAc) adhesives are modified by introducing hydrogen-bonding units into the polymer chain through copolymerization of vinyl acetate with monomers bearing urea and bis-urea hydrogen-bonding motifs. Comonomers suitable for vinyl acetate are designed, synthesized and characterized.
  • Havukainen, Heikki (2015)
    Managing a telecommunications network requires collecting and processing a large amount of data from the base stations. The current method used by the infrastructure providers is hierarchical, and it has significant performance problems. As the amount of traffic within telecommunications networks is expected to continue increasing rapidly in the foreseeable future, these performance problems will become more and more severe. This thesis outlines a distributed publish/subscribe solution that is designed to replace the current method used by the infrastructure providers. We propose an intermediate layer between the base stations and the network management applications, built on top of Apache Kafka. The solution is qualitatively evaluated from different aspects. ACM Computing Classification System (CCS): Networks → Network management; Networks → Network architectures
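The essence of the proposed intermediate layer is topic-based publish/subscribe: base stations publish telemetry without knowing who consumes it, and management applications subscribe to the topics they need. A minimal in-memory sketch of that pattern; the real design uses Apache Kafka, and the topic name and message fields below are hypothetical.

```python
from collections import defaultdict

class Broker:
    """A minimal in-memory stand-in for a topic-based publish/subscribe
    layer between base stations and management applications."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver to every subscriber of the topic; publishers are
        # decoupled from consumers, unlike in a hierarchical design.
        for callback in self._subscribers[topic]:
            callback(message)

broker = Broker()
received = []
broker.subscribe("bs42.kpi", received.append)      # management app
broker.publish("bs42.kpi", {"throughput_mbps": 310})  # base station
broker.publish("bs99.kpi", {"throughput_mbps": 120})  # no subscriber
print(received)
```

Kafka adds what this sketch lacks — durable partitioned logs, consumer groups, and horizontal scaling — which is what addresses the performance problems of the hierarchical method.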
  • Siurua, Joel (2023)
    Contacts between individuals play a central part in infectious disease modelling. Social or physical contacts are often determined through surveys. These types of contacts may not accurately represent the truly infectious contacts due to demographic differences in susceptibility and infectivity. In addition, surveyed data is prone to statistical biases and errors. For these reasons, a transmission model based on surveyed contact data may make predictions that are in conflict with real-life observations. The surveyed contact structure must be adjusted to improve the model and produce reliable predictions. The adjustment can be done in multiple different ways. We present five adjustment methods and study how the choice of method impacts a model’s predictions about vaccine effectiveness. The population is stratified into n groups. All five adjustment methods transform the surveyed contact matrix such that its normalised leading eigenvector (the model-predicted stable distribution of infections) matches the observed distribution of infections. The eigenvector method directly adjusts the leading eigenvector. It changes contacts antisymmetrically: if contacts from group i to group j increase, then contacts from j to i decrease, and vice versa. The susceptibility method adjusts the group-specific susceptibility of individuals. The changes in the contact matrix occur row-wise. Analogously, the infectivity method adjusts the group-specific infectivity; changes occur column-wise. The symmetric method adjusts susceptibility and infectivity in equal measure. It changes contacts symmetrically with respect to the main diagonal of the contact matrix. The parametrised weighting method uses a parameter 0 ≤ p ≤ 1 to weight the adjustment between susceptibility and infectivity. It is a generalisation of the susceptibility, infectivity and symmetric methods, which correspond to p = 0, p = 1 and p = 0.5, respectively. 
For demonstrative purposes, the adjustment methods were applied to a surveyed contact matrix and infection data from the COVID-19 epidemic in Finland. To measure the impact of the method on vaccination effectiveness predictions, the relative reduction of the basic reproduction number was computed for each method using Finnish COVID-19 vaccination data. We found that the eigenvector method has no impact on the relative reduction (compared to the unadjusted baseline case). As for the other methods, the predicted effectiveness of vaccination increased the more infectivity was weighted in the adjustment (that is, the larger the value of the parameter p). In conclusion, our study shows that the choice of adjustment method has an impact on model predictions, namely those about vaccination effectiveness. Thus, the choice should be considered when building infectious disease models. The susceptibility and symmetric methods seem the most natural choices in terms of contact structure. Choosing the "optimal" method is a potential topic to explore in future research.
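The susceptibility method has a simple closed form: scaling row i of the contact matrix C by s_i = v_i / (Cv)_i makes the observed distribution v an eigenvector of the adjusted matrix, and for positive v and non-negative C it is the leading (Perron) eigenvector. A sketch with a hypothetical 3-group matrix; the leading eigenvalue is set to 1 here, whereas a real model would fix it via R0.

```python
import numpy as np

def susceptibility_adjust(C, v):
    """Row-wise ('susceptibility') adjustment of a contact matrix C so the
    observed infection distribution v becomes the leading eigenvector.

    Scaling row i by s_i = v_i / (C v)_i gives (SC) v = v componentwise,
    so v is an eigenvector of SC with eigenvalue 1.
    """
    s = v / (C @ v)
    return np.diag(s) @ C

C = np.array([[10.0, 3.0, 1.0],   # surveyed contacts between 3 groups
              [3.0, 8.0, 2.0],
              [1.0, 2.0, 5.0]])
v = np.array([0.5, 0.3, 0.2])     # observed distribution of infections
A = susceptibility_adjust(C, v)

eigvals, eigvecs = np.linalg.eig(A)
lead = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
lead = lead / lead.sum()          # normalised leading eigenvector
print(np.round(lead, 3))          # matches v
```

The infectivity method is the column-wise analogue (scale columns instead of rows), and the symmetric method splits the scaling equally between rows and columns.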
  • Ratilainen, Katja-Mari (2023)
    Context: The Bank of Finland, as the national monetary and central bank of Finland, possesses an extensive repository of data that fulfills both the statistical needs of international organizations and national requirements. Data scientists within the bank are increasingly interested in investing in machine learning (ML) capabilities to develop predictive models. MLOps offers a set of practices that ensure the reliable and efficient maintenance and deployment of ML models. Objective: In this thesis, we focus on addressing how to implement an ML pipeline within an existing environment. The case study is explorative in nature, with the primary objective of gaining deeper insight into MLOps tools and their practical implementation within the organization. Method: We apply the design science research methodology to divide design and development into six tasks: problem identification, objective definition, design and development, demonstration, evaluation, and communication. Results: We select the MLOps tools based on the user requirements and the existing environment, then design and develop a simplified end-to-end ML pipeline utilizing the chosen tools. Lastly, we conduct an evaluation to measure the alignment between the selected tools and the initial user requirements.
  • Suuronen, Markus (2021)
    People spend more than 90% of their time indoors, which has made the analysis of indoor air quality a subject of interest. Miniaturized sample extraction techniques that utilize solid adsorbent materials and thermal desorption, allowing direct sample introduction for analysis, are growing in popularity. This approach is solvent-free, and depending on the adsorbent's properties the adsorbent material may be reusable. This thesis covers the basics of the adsorption-desorption process, takes a detailed look at different adsorbent materials, such as activated carbon (AC), metal-organic frameworks (MOF) and carbon nanotubes (CNT), and evaluates the effect of surface functionality and pore size distribution on the adsorption process. In the experimental part, the functionality and injection parameters of a self-made autosampler were optimized. The autosampler is able to independently inject up to six in-tube extraction (ITEX) needles with complete desorption. The ITEX devices were constructed during this work with TENAX-GR adsorbent, and the repeatability of the autosampler and the ITEXs was tested and compared to a commercial system by extracting different amines. The effectiveness of the system was also demonstrated for indoor volatile organic compound (VOC) analysis.
  • Leskinen, Juno (2022)
    The continuously evolving cyber threat landscape has become a major concern, as sophisticated attacks against systems connected to the Internet have become frequent. Of particular concern are the threats known as Advanced Persistent Threats (APTs). The thesis aims to introduce what APTs are and to illustrate related topics, such as the tools and methods attackers can use. Attack models are also explained, with example models proposed in the literature. The thesis further introduces the kinds of operational objectives attacks can have and, for each objective, gives one example attack that characterizes it. In addition, the thesis covers various countermeasures, including the most essential security solutions, complemented with more advanced methods. The last countermeasure the thesis introduces is attribution analysis.
  • Hirvikoski, Kasper (2015)
    Software delivery has evolved notably over the years, starting from plan-driven methodologies and lately moving to principles and practises shaped by Agile and Lean ideologies. The emphasis has moved from thoroughly documenting software requirements to a more people-oriented approach of building software in collaboration with users and experimenting with different approaches. Customers are directly integrated into the process. Users cannot always identify software needs before interacting with actual implementations. Building software is not only about building products in the right way, but also about building the right products. Developers need to experiment with different approaches, directly and indirectly. Not only do users value practical software, but the development process must also emphasise the quality of the product or service. Development processes have formed to support these ideologies. To enable a short feedback cycle, features are deployed to production often. Software is primarily delivered through a pipeline consisting of three stages: development, staging and production. Developers develop features by writing code, verify them by writing related tests, interact with and test the software in a production-like 'staging' environment, and finally deploy features to production. Many practises have formed to support this deployment pipeline, notably Continuous Integration, Deployment and Experimentation. These practises focus on improving the flow of how software is developed, tested, deployed and experimented with. The Internet has provided a thriving environment for using new practises. Due to the distributed nature of the web, features can be deployed without any interaction from users. Users might not even notice the change. Obviously, there are other environments where many of these practises are much harder to achieve. 
Embedded systems, which have a dedicated function within a larger mechanical or electrical system, require hardware to accompany the software. The related processes and environments have their limitations. Hardware development can only be iterative to a certain degree. Producing hardware takes up-front design and time. Experimentation is more expensive. Many stringent contexts require processes with assurances and transparency, usually provided by documentation and long testing phases. In this thesis, I explore how advances in streamlining software delivery on the web have influenced the development of embedded systems. I conducted six interviews with people working on embedded systems to get their views and invite discussion about the development of embedded systems. Though many concerns and obstacles are presented, the field is struggling with the same issues that Agile and Lean development are trying to resolve. Plan-driven approaches are still used, but distinct features of iterative development can be observed. On the leading edge, organisations are actively working on streamlining software and hardware delivery for embedded systems. Many of the advances are based on how Agile and Lean development are being used for user-focused software, particularly on the web.
  • Trizna, Dmitrijs (2022)
    The detection heuristic in contemporary machine learning Windows malware classifiers is typically based on the static properties of the sample. In contrast, the simultaneous utilization of static and behavioral telemetry is only vaguely explored. We propose a hybrid model that employs dynamic malware analysis techniques, contextual information in the form of the executable's filesystem path on the system, and the static representations used in modern state-of-the-art detectors. It does not require an operating system virtualization platform; instead, it relies on kernel emulation for dynamic analysis. Our model improves the detection heuristic and identifies malicious samples even if none of the separate models expresses high confidence in categorizing the file as malevolent. For instance, at a 0.05% false positive rate, the individual static, dynamic, and contextual model detection rates are 18.04%, 37.20%, and 15.66%. However, we show that composite processing of all three achieves a detection rate of 96.54%, above the cumulative performance of the individual components. Moreover, the simultaneous use of distinct malware analysis techniques addresses the weaknesses of the independent units, minimizing false positives and increasing adversarial robustness. Our experiments show a decrease in contemporary adversarial attack evasion rates from 26.06% to 0.35% when behavioral and contextual representations of the sample are employed in the detection heuristic.
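The key effect — the composite flags a file even when no individual module is confident — is characteristic of late fusion. The thesis trains a composite model over all three representations; the closed-form noisy-OR fusion below is only an illustration of that effect, and the scores and threshold are made up.

```python
def fuse(p_static, p_dynamic, p_context):
    """Noisy-OR late fusion of per-module maliciousness scores: the file
    is flagged if the modules are jointly, though not individually,
    confident that it is malicious."""
    return 1.0 - (1.0 - p_static) * (1.0 - p_dynamic) * (1.0 - p_context)

# None of the modules alone clears a 0.9 decision threshold...
scores = (0.6, 0.7, 0.5)
combined = fuse(*scores)
print(f"combined score: {combined:.3f}")  # ...but their fusion does: 0.940
```

Because an evasive sample must now defeat the static, behavioral, and contextual views simultaneously, fusion also explains the reported drop in adversarial evasion rates.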
  • Rantama, Jenny (2020)
    The two input rivers of Säkylä's Lake Pyhäjärvi, Pyhäjoki and Yläneenjoki, were studied with aerial thermal infrared (TIR) imaging analysis and a baseflow program in order to estimate the baseflow in the two rivers. From the helicopter-assisted TIR survey made in July 2011, almost 200 groundwater discharge (GWD) sites were located in the two studied rivers. The groundwater discharge anomalies were categorized into 5 classes: 1) spring/springs, 2) cold channel connected to the main channel, 3) diffuse discharge to the river, 4) wetland/wide seepage, 5) unknown anomaly. In addition, a temperature analysis was performed on the studied rivers. In both rivers, the temperature analysis revealed a pattern of river water temperature increasing from the headwaters towards the river outlet. The baseflow share was estimated with a baseflow filtering program that uses a recursive digital filter for signal processing. The mean baseflow share estimates over four years, 2010-2013, were 70 % for River Pyhäjoki and 54 % for River Yläneenjoki. The larger baseflow portion, lower river water temperature and wide diffuse discharge areas of River Pyhäjoki indicate that Pyhäjoki receives a greater groundwater contribution than River Yläneenjoki. Previous studies of the Lake Pyhäjärvi catchment likewise show signs of a higher groundwater share in the River Pyhäjoki catchment. However, the TIR and baseflow estimation results of this study have to be treated with caution. The TIR results represent momentary circumstances, and the GWD locations are interpretations. Many factors also increase the uncertainty of the temperature analysis and of the observations of GWD anomalies. The results of the baseflow analysis have to be interpreted carefully too, because baseflow filtering is pure signal processing. Nevertheless, this study shows that River Pyhäjoki and River Yläneenjoki have a groundwater contribution, and that there is a difference in the groundwater share of the two studied rivers.
In River Pyhäjoki the larger groundwater share (70 %) is related to the coarser-grained glacial deposits in the river catchment. In the TIR results, the influence of the headwaters of the River Pyhäjoki, fed by two large springs, Myllylähde and Kankaanranta, was emphasized. The two feeding springs are connected to the Säkylä-Virttaankangas esker complex. In the River Yläneenjoki catchment, where the groundwater portion was estimated to be smaller (54 %) and the GWD anomalies were mostly discrete, there are only two small till groundwater areas near the river channel, and the catchment is characterized by finer sediments than the River Pyhäjoki catchment.
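The recursive digital filter used for baseflow separation is, in most such programs, some variant of the one-parameter Lyne-Hollick filter. The sketch below implements a single forward pass of that standard filter on synthetic daily discharge; the thesis's program is similar in spirit but not necessarily identical, and alpha = 0.925 is the commonly recommended value for daily data, not one taken from the thesis.

```python
import numpy as np

def lyne_hollick_baseflow(q, alpha=0.925):
    """One forward pass of the Lyne-Hollick recursive digital filter.

    Separates quickflow from total streamflow q; baseflow is the
    remainder, constrained to 0 <= baseflow <= q.
    """
    quick = np.zeros_like(q)
    for t in range(1, len(q)):
        quick[t] = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick[t] = min(max(quick[t], 0.0), q[t])  # keep baseflow in [0, q]
    return q - quick

# Synthetic daily discharge: slow recession plus two storm peaks.
days = np.arange(120)
q = (2.0 + 1.5 * np.exp(-days / 60)
     + 8.0 * np.exp(-0.5 * ((days - 30) / 3.0) ** 2)
     + 5.0 * np.exp(-0.5 * ((days - 80) / 4.0) ** 2))
base = lyne_hollick_baseflow(q)
bfi = base.sum() / q.sum()  # baseflow index, cf. the 70 % / 54 % estimates
print(f"baseflow share: {100 * bfi:.0f} %")
```

Because the filter is pure signal processing, the resulting baseflow index depends on alpha and the number of passes — the reason the abstract cautions that the estimates must be interpreted carefully.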
  • Le, Viet (2021)
    Atmospheric aerosol particles absorb and scatter solar radiation, directly altering the Earth's radiation budget. These particles also have a complex role in weather and climate by changing cloud physical properties, such as reflectivity, by acting as cloud condensation nuclei or ice nuclei. Aerosol particles in the boundary layer are important because they have a negative impact on air quality and human health. In addition, elevated aerosol from volcanic dust or desert dust presents an imminent threat to aviation safety. To improve our understanding of the role of aerosol in influencing climate and our capability to detect volcanic ash, a ground-based network of Halo Doppler lidars operating at a wavelength of 1565 nm is used to collect atmospheric vertical profiles across Finland. By comparing the theoretical depolarization ratio of liquid clouds with the observed values, the bleed-through of each lidar is detected and corrected to improve data quality. The background noise levels of these lidars are also collected to assess their stability and durability. A robust classification algorithm is created to extract aerosol depolarization ratios from the data to calculate overall statistics. This study finds the bleed-through to be 0.017 ± 0.0072 for the Uto-32 lidar and 0.0121 ± 0.0071 for the Uto-32XR lidar. Examination of the time series of background noise level shows these instruments to be stable and durable. The results from the classification algorithm show that it successfully classified aerosol, cloud, and precipitation even on days with high turbulence. Depolarization ratios of aerosol across all the sites were extracted, and their means were found to be 0.055 ± 0.076 in Uto, 0.076 ± 0.090 in Hyytiala, 0.076 ± 0.071 in Vehmasmaki and 0.041 ± 0.089 in Sodankyla. These mean depolarization ratios vary by season and location: they peak during summer, when pollen is abundant, and remain lowest in winter.
As Sodankylä is located in the Arctic, in most years it has aerosols with a lower depolarization ratio than the other sites. This study found that in summer the aerosol depolarization ratio is positively correlated with relative humidity and negatively correlated with height. No conclusion was drawn as to which processes play the more important role in these correlations. This study offers an overview of the depolarization ratio of aerosol at a wavelength of 1565 nm, which is not commonly reported in the literature. This opens a new possibility of using Doppler lidars for aerosol measurements to support air quality monitoring and the safety of aviation. Further research can be done to test the capability of the depolarization ratio at this wavelength to differentiate elevated aerosol, such as dust, pollution and volcanic ash, from boundary-layer aerosol.
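Target classification of this kind typically screens each range gate by signal quality and then separates target types by their optical signatures. The two-threshold toy below (signal-to-noise ratio plus depolarization ratio) is a deliberately simplified stand-in: the thesis's algorithm uses more inputs, handles turbulence, and distinguishes aerosol, cloud and precipitation, and both thresholds here are invented for illustration.

```python
import numpy as np

def classify_gates(snr, depol, snr_min=1.0, depol_split=0.3):
    """Toy target classification for one lidar profile.

    Gates with insufficient signal become 'noise'; of the rest, strongly
    depolarizing targets (e.g. ice, dust) are separated from weakly
    depolarizing ones (spherical droplets, most boundary-layer aerosol).
    """
    labels = np.full(snr.shape, "noise", dtype=object)
    good = snr >= snr_min
    labels[good & (depol >= depol_split)] = "depolarizing"
    labels[good & (depol < depol_split)] = "weakly depolarizing"
    return labels

snr = np.array([5.0, 3.0, 0.2, 4.0])
depol = np.array([0.06, 0.45, 0.90, 0.05])
print(classify_gates(snr, depol))
```

Computing the mean aerosol depolarization ratio per site then reduces to averaging `depol` over the gates classified as aerosol.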
  • Heikkinen, Liine Maria (2016)
    Atmospheric fine particles, i.e. aerosol particles, affect the Earth's radiative forcing depending on their chemical composition. Chemical composition governs whether particles larger than 50 nanometres tend to scatter or absorb solar radiation. On the other hand, particles can also participate in cloud formation if their composition is sufficiently oxidised that they can bind water molecules around themselves. Without aerosol particles the Earth would certainly be much warmer, since both direct scattering of radiation and indirect scattering via clouds are important cooling mechanisms of the atmosphere. The amount of solar radiation, and thus surface temperatures, drive atmospheric particle concentrations through both human activity and nature. Primary particle sources and precursors are enormously varied, which creates a wide range of particle physicochemical properties. This work presents a study of the seasonal variation of aerosol particle chemical composition at the SMEAR II station in southern Finland, where the seasonal variation of temperature is large. The measurements employed mass spectrometry and in situ filter measurements. Aerosol chemistry measurements are part of the routine measurements at SMEAR II and have been running continuously since 2012. This thesis analyses the four thermal seasons of 2014, covering the period 23 January - 27 October 2014. The analysis methods were found useful and will be applied in the future to the full four-year time series (2012-2015). The aerosol particles were found to consist mainly of biogenic organic compounds in summer (~77 %) and mostly of inorganic compounds in winter (~54 %), the latter being long-range transport from cities. Sulphate consistently dominated the inorganic mass regardless of season. In summer, the oxidative capacity of the atmosphere was reflected in the volatility of the organic aerosol, lowering it considerably. In summer, semi-volatile organic aerosol and nitrate showed a clear diurnal cycle, with most of the mass in the particle phase at night and disappearing into the gas phase at midday. No such behaviour was observed for the organic aerosol types in winter, owing partly to colder temperatures and partly to the different composition of the organic aerosol. Interpreting the diurnal cycles of the inorganic compounds was challenging because of the abundant temporal variability of long-range transport, which can be minimised in the future, more extensive analysis.
  • Luoma, Krista (2017)
    The Earth's atmosphere contains microscopically small aerosol particles, which affect both human health and the Earth's climate. Aerosol particles influence the climate by interacting with solar radiation and by participating in cloud formation processes. This thesis focuses on the optical properties of aerosol particles, meaning their ability to scatter and absorb radiation at different wavelengths. It examines in-situ measurements of aerosol scattering, backscattering and absorption performed at the SMEAR II station, available since 2006. From the measured scattering, backscattering and absorption coefficients, intensive quantities describing the size distribution and composition of the aerosol particles can be derived. The long-term measurements revealed trends and clear seasonal variation in the optical properties. By comparing the optical measurements with size distribution measurements, a complex refractive index could be determined for the aerosol particles; it is used in modeling Mie scattering and absorption. The absorption coefficient measurements were also compared with elemental carbon measurements to determine the mass absorption cross-section of the aerosol particles. With the mass absorption cross-section, the black carbon concentration can be derived from absorption coefficient measurements. At the SMEAR II station the absorption coefficient has been measured with three different instruments (an aethalometer, a Particle Soot Absorption Photometer (PSAP) and a Multi-Angle Absorption Photometer (MAAP)), which were compared with each other. By comparing the aethalometer and MAAP measurements, the parameters describing multiple scattering in the filter, used in the aethalometer correction algorithms, were determined for the conditions at SMEAR II. An optical closure study was also carried out by comparing the measured extinction, scattering and absorption coefficients. Based on the results, the measurement accuracies of the Cavity Attenuated Phase Shift extinction monitor (CAPS), which measures extinction, and of the integrating nephelometer, which measures scattering, are not sufficient to determine the absorption of particle-phase aerosols.
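The abstract above notes that a black carbon concentration can be derived from an absorption coefficient via the mass absorption cross-section. A minimal sketch of that unit conversion is shown below; the MAC value and inputs are illustrative placeholders, not results from the thesis.

```python
# Equivalent black carbon (eBC) from an absorption coefficient:
#   eBC = b_abs / MAC
# The MAC value used in the example call is a hypothetical placeholder.

def ebc_concentration(b_abs_per_Mm: float, mac_m2_per_g: float) -> float:
    """eBC mass concentration in micrograms per cubic metre.

    b_abs_per_Mm : absorption coefficient in inverse megametres (Mm^-1)
    mac_m2_per_g : mass absorption cross-section in m^2 per gram
    """
    b_abs_per_m = b_abs_per_Mm * 1e-6   # Mm^-1 -> m^-1
    ebc_g_per_m3 = b_abs_per_m / mac_m2_per_g
    return ebc_g_per_m3 * 1e6           # g m^-3 -> ug m^-3

# Example: 10 Mm^-1 at an assumed MAC of 6.6 m^2/g
print(f"{ebc_concentration(10.0, 6.6):.2f} ug/m3")
```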
  • Jokela, Jesse (2018)
    The literature part of this Master's thesis is a review of the literature on the development and applications of aerosol mass spectrometry over roughly the past ten years. Aerosol mass spectrometry means the application of mass spectrometry to the measurement of the size-resolved chemical composition of aerosol particles. In the thesis, aerosol mass spectrometers are divided into two groups according to whether the instrument analyzes individual particles or ensembles of several particles of a given size collected over a short period of time. Ensemble-analyzing aerosol mass spectrometers generally use thermal vaporization to desorb the particles before ionization, whereas single-particle instruments desorb the particles one at a time, typically with a pulsed laser. Aerosol mass spectrometers can be used for many kinds of atmospheric research. The thesis highlights in particular studies of organic aerosols and their elemental composition, of sea salt, and of metals and other trace elements. Separate chapters are devoted to the aerosol chemical speciation monitor and to instruments that can be coupled to an aerosol mass spectrometer, such as the thermal desorption aerosol gas chromatograph, the potential aerosol mass chamber and the light scattering module. The literature review closes with a brief treatment of aerosol source apportionment. Further quantification and clarification of the mechanisms and kinetics of atmospheric aerosol nucleation, growth and aging are needed, because atmospheric aerosol particles affect the Earth's climate, local air pollution and human health. Aerosol mass spectrometers differ widely in their characteristics, and no single instrument suits every situation. Simultaneous measurements with several different types of aerosol mass spectrometers and with traditional analytical instruments complement each other and comprehensively improve the understanding of the different methods and of particle properties. The aim of the experimental part of the thesis was to measure the size-resolved chemical composition of particles in an urban environment with both sample-collection and online measurement methods, and to compare the methods. Field measurements were carried out with a soot particle aerosol mass spectrometer (SP-AMS). Size-resolved aerosol samples collected on filter substrates with a cascade impactor were analyzed in the laboratory with an ion chromatograph (IC). The size distributions measured with the online and sample-collection methods mostly agreed fairly well. The superior time resolution of online methods makes it possible to study the effects of natural sources of atmospheric fine particles, anthropogenic emissions and weather conditions far more precisely than with sample-collection methods. Determining the measurement error is clearly more difficult in aerosol mass spectrometer measurements than in ion chromatography analyses. It can nevertheless be stated with confidence that, owing to the number of analysis steps, the error in IC and IC-MS measurements is considerably larger than in SP-AMS measurements.
  • Khansari, Marzieh (2018)
    A climate feedback is a response of the climate system to a perturbation through a number of mechanisms. Perturbations can be due to natural factors, such as volcanic activity or changes in solar activity, or anthropogenic ones, such as emissions of long-lived greenhouse gases and aerosol particles. Atmospheric aerosols affect the Earth's radiation budget: they impact radiation directly by scattering and absorbing incoming solar radiation, and indirectly by changing cloud properties through the formation of cloud condensation nuclei. Here, the aerosol radiation feedback loop associated with the continental biosphere-aerosol-cloud-climate (COBACC) feedback loop is suggested. This negative feedback loop connects increasing atmospheric CO2 concentration, rising temperatures, the formation of aerosol particles from emissions of biogenic volatile organic compounds, changes in the ratio of diffuse to global radiation under clear-sky conditions, and changes in plant gross primary production. In this study, in-situ atmospheric measurement data from the Hyytiälä station were used, as well as satellite measurement data (CERES (Clouds and the Earth's Radiant Energy System) and MODIS (Moderate Resolution Imaging Spectroradiometer)) around Hyytiälä and over a small area of the western Siberian plain, for clear-sky conditions in June and July around noon. Three methods for detecting clear-sky conditions were considered: a brightness parameter, global radiation smoothing, and the MODIS cloud mask. The MODIS cloud mask was selected as the most suitable method owing to data availability and global coverage. This study partly confirmed the existence of the aerosol radiation feedback loop by finding positive correlations between some of its components, such as between the condensation sink (CS) and temperature, between the ratio of diffuse to global radiation (R) and CS, and between R and temperature. Additionally, it was shown that satellite-based data compare well with in-situ data, so satellite-based data can be used to study the aerosol-radiation feedback loop. The impact of relative humidity on the relation between R and temperature was also investigated: it is important to take the hygroscopic swelling effect into account when examining this relation. In contrast, the solar zenith angle had no impact on the relation during the study period (June-July).
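The study above rests on finding positive correlations between feedback-loop components such as the condensation sink (CS) and temperature. A minimal sketch of such a correlation check is shown below; the sample values are invented for illustration, not measurements from the study.

```python
# Pearson correlation between two feedback-loop components, e.g. the
# condensation sink (CS) and temperature. The data are illustrative only.
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

temperature = [12.0, 15.0, 18.0, 21.0, 24.0]    # deg C (illustrative)
cs = [1.1e-3, 1.4e-3, 1.9e-3, 2.4e-3, 3.0e-3]   # s^-1 (illustrative)
print(f"r = {pearson_r(temperature, cs):.2f}")  # strongly positive
```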
  • Kuna, Kamilla (2021)
    This research studies the environmental aesthetics of boredom in the light of post-Soviet neighborhoods. While belonging to the grey zone of aesthetics, boredom is an integral part of mundane life that challenges us to notice the uniqueness of our everydayness. Even though mass housing provided an economically feasible solution for sheltering millions of people, it lacks qualities that meet contemporary living and energy standards. Soviet mass housing architecture and microdistrict designs were driven primarily by economic conditions. Mass housing was not meant to speak, but its residents should not be silenced because of that. We tend to preserve what we find aesthetically pleasing (Nassauer, 1997); yet what counts as aesthetically pleasing, and what of other values? In this project, preservation is essential for building a contemporary-oriented mindset that could lead to more sustainable and inclusive neighborhoods. The project aims to tackle the issue of Soviet districts through the eyes of their residents, in contrast to the more commonly used top-down approach. Here I challenge the residents' perception of their neighborhood and create a moment for reflection. By offering this space, I amplify the voices of the real experts, the ones who know their microdistrict inside out. Environmental aesthetics is a relatively new concept within contemporary urban planning scholarship, giving a fresh take on subjective experiences of urban settings that unveil deeply rooted and often disguised problems. Interdisciplinarity is achieved by merging disciplines such as sociology, urban aesthetics, urban history, and philosophy. Whether boredom belongs among positive or negative aesthetic values is questioned later in the research, as is the legacy of Soviet mass housing. The ideology behind Soviet blocks is discussed, creating common ground for diverse readers. Inclusivity is achieved through resident participation using the visual research method of photovoice. To avoid biased data, the resident experiences are complemented with the city planner's point of view and secondary quantitative data. The findings include a photovoice analysis of the Laumas microdistrict, with its residents as the primary information providers. By taking pictures of their everyday surroundings, residents are given space to show how they see their microdistrict, outlining the values they are proud of or pointing out what needs to be changed. Instead of a one-sided creation, the curation is inclusive and more reflective of existing residents' values of their neighborhood. Residents' aesthetic preferences open a broader discussion of the maintenance issues microdistricts face nowadays. The topics are varied, but the primary outcomes concern built environment aesthetics, renovation, communication, identity, resident initiative, automobile domination, and natural environment aesthetics. In the final part of the study, some possible directions for changing the microdistrict are pointed out, and further research questions are presented. The project is incomplete until it reaches a broader audience and provides knowledge to politicians, city planners, and other residents.
  • Niu, Yimeng (2016)
    While health establishes the basis of our life, at times we need to visit doctors or hospitals. In doing so, patients may face inequalities, for example due to distances to healthcare resources. With the development of telecommunications and the Internet of Things, telemedicine may assist in such cases, saving travel time and cost. This thesis suggests a telemedicine monitoring solution for both hospital-based and personal users. The focus is on the architecture of the system, the role of wireless sensors in telemedicine, and key telemedicine technologies (such as Bluetooth and ZigBee). Further, a software structure for monitoring patients' physiological state remotely, both at the hospital and at home, is suggested. This also involves the choice of suitable hardware for data acquisition and wireless transmission. Finally, related scientific research is discussed, and the proposed solution is compared with other similar designs from different angles depending on the focus of the other work, such as processing performance, connectivity, usability, unit price, data security, and decision making.
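A monitoring architecture like the one described passes sensor readings from a wireless acquisition layer to remote monitoring software. The sketch below is a hypothetical illustration of such a record and a simple out-of-range check; all field names and thresholds are assumptions for illustration, not the thesis design.

```python
# Hypothetical record passed from the data-acquisition layer (e.g. a
# Bluetooth or ZigBee sensor gateway) to the remote monitoring software.
from dataclasses import dataclass

@dataclass
class VitalSignReading:
    patient_id: str
    sensor: str        # e.g. "heart_rate"; transport layer is abstracted away
    value: float
    timestamp: float   # Unix seconds

def needs_alert(reading: VitalSignReading, low: float, high: float) -> bool:
    """Flag a reading that falls outside the configured safe range."""
    return not (low <= reading.value <= high)

r = VitalSignReading("patient-001", "heart_rate", 128.0, 1700000000.0)
print(needs_alert(r, low=50.0, high=120.0))  # True: above the safe range
```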