
Browsing by Title


  • Ruottu, Toni (University of Helsinki, 2011)
    As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so that the data can maintain its identity while being passed around. This way there is only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of a user's data could be stored on some of their family members' computers, on some of their own computers, and also at some online services they use. When all actors operate on one replicated copy of the data, the system automatically avoids a single point of failure. Thus the data will not disappear with one computer breaking or one service provider going out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable to users and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing anonymous web access, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and being monitored. All of the systems use cryptography to secure the names used for the content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database.
Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by revealing its contents through an integrated HTTP server. The REST-like HTTP API supports development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we came to the conclusion that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we are not expecting our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
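The cryptographically verifiable references described above are essentially content-addressed names: an item is named by a hash of its content, so any replica can check that the data behind a name has not been forged. A minimal Python sketch of the idea (not Peerscape's actual implementation; the class and method names are illustrative):

```python
import hashlib

class ContentStore:
    """Toy content-addressed store: each item is named by the SHA-256
    hash of its content, so any replica can verify integrity."""
    def __init__(self):
        self._items = {}

    def put(self, data: bytes) -> str:
        name = hashlib.sha256(data).hexdigest()
        self._items[name] = data
        return name

    def get(self, name: str) -> bytes:
        data = self._items[name]
        # Verify the cryptographic name before trusting a replica.
        assert hashlib.sha256(data).hexdigest() == name
        return data

store = ContentStore()
ref = store.put(b"family photo album")
print(store.get(ref) == b"family photo album")  # True
```

Because the name is derived from the content, the same album stored at several locations always resolves to identical bytes.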
  • Kauppala, Tuuli (2021)
    Children’s height and weight development remains a subject of interest, especially due to the increasing prevalence of overweight and obesity in children. With statistical modeling, height and weight development can be examined as separate or connected outcomes, aiding understanding of the phenomenon of growth. As a biological connection between height and weight development can be assumed, their joint modeling is expected to be beneficial. A further advantage of joint modeling is the convenience it brings to Body Mass Index (BMI) prediction. In this thesis, we modeled longitudinal data on children’s heights and weights from the Finlapset register of the Finnish Institute for Health and Welfare (THL). The research aims were to predict the modeled quantities together with the BMI, to interpret the obtained parameters in relation to the phenomenon of growth, and to investigate the impact of municipalities on the growth of children. The dataset’s irregular, register-based nature, together with positively skewed, heteroscedastic weight distributions and within- and between-subject variability, suggested Hierarchical Linear Models (HLMs) as the modeling method of choice. We used HLMs in a Bayesian setting, with the benefits of incorporating existing knowledge and obtaining the full posterior predictive distribution for the outcome variables. HLMs were compared with the less suitable classical linear regression model, and bivariate and univariate HLMs with or without area as a covariate were compared in terms of their posterior predictive precision and accuracy. One of the main research questions was the model’s ability to predict the BMI of a child, which we assessed with various posterior predictive checks (PPC). The most suitable model was used to estimate growth parameters of 2- to 6-year-old males and females in Vihti, Kirkkonummi and Tuusula.
With the parameter estimates, we could compare the growth of males and females, assess the contributions of within-subject and between-subject variability to growth, and examine the correlation between height and weight development. Based on the work, we could conclude that the bivariate HLM constructed provided the most accurate and precise predictions, especially for the BMI. The area covariates did not provide additional advantage to the models. Overall, Bayesian HLMs are a suitable tool for the register-based dataset of this work, and together with log-transformation of height and weight they can be used to model skewed and heteroscedastic longitudinal data. However, the modeling would ideally require more observations per individual than we had, and proper out-of-sample predictive evaluation would ensure that the current models are not over-fitted with regard to the data. Nevertheless, the models built can already provide insight into contemporary Finnish childhood growth and can be used to simulate and predict future population BMI distributions.
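Because height and weight are modeled on the log scale, a BMI prediction follows directly from the predicted log-quantities: log(BMI) = log(weight) - 2·log(height). A small illustration (the numeric values are hypothetical, not from the thesis data):

```python
import math

def bmi_from_log(log_weight_kg: float, log_height_m: float) -> float:
    """BMI = weight / height**2 follows directly on the log scale:
    log(BMI) = log(weight) - 2*log(height)."""
    return math.exp(log_weight_kg - 2.0 * log_height_m)

# An illustrative child of 1.05 m and 17 kg:
print(round(bmi_from_log(math.log(17.0), math.log(1.05)), 2))  # -> 15.42
```

This is also why a joint (bivariate) posterior over log-height and log-weight yields a full predictive distribution for BMI: each posterior draw maps to one BMI value.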
  • Tse, Yu Tat (2018)
    The literature part of this thesis reviews the development of portable gas chromatographs (GC) and their application in gas analysis. The scope includes portable capillary GC and chip-based GC. Gas chromatography is a separation technique based on the different retention behavior of compounds in a stationary phase. The use of a portable GC enables chemists to carry out rapid on-site chemical analysis. Rapid, on-site analyses are valuable in fields such as air quality monitoring, emergency response and forensic applications. Studies have shown that the performance of portable GC analysis in these fields was as promising as conventional, bench-top GC analysis. The aims of portable GC development were mainly improved separation efficiency, faster analysis, greater portability, reduced power consumption, increased autonomous operating time and a lower detection limit. The different components of a portable GC are reviewed: separating channels/columns and stationary phases, the temperature programming system, the pre-concentrator, the injector and the detector. Semi-packed columns and materials with a great surface-area-to-volume ratio as stationary phase supports were researched to increase the surface area of retention. An improved separation efficiency was observed. Multi-channel capillary chips were fabricated to increase the sample capacity of the column. Resistive heating was used in portable GC to provide a high heating rate. This enables high-efficiency separation in fast GC analysis. Efforts were made to reduce the power consumption of the heating system to increase portable operating time. Using ambient air as the carrier gas eliminates the need for a helium gas tank in the portable GC system. Research was done to overcome the limitations of using ambient air. The vacuum-outlet GC technique was used to speed up the analysis. Air purification methods were discussed to provide a stable supply of clean air. The stability of stationary phases in ambient air was compared.
A pre-concentrator is always used to lower the detection limit of gas analysis. Solid-phase microextraction (SPME) devices were commonly used. Micro-fabricated pre-concentrators were designed to enrich the analyte on-line prior to sample injection. The types of adsorbent used in pre-concentrators and the methods to achieve selectivity were discussed. Miniaturized detectors reported in portable GCs were reviewed. Changes in detector design were made to enhance signal quality and sensitivity in various detectors. They were made very small and light to increase the portability of the GC. Lastly, portable GCxGC systems are also mentioned. GCxGC has higher separation power than a one-dimensional GC system. It allows chemists to separate analytes from complicated matrices. Pneumatic and thermal modulation, which transfer analyte bands from column to column, were described. The advantages of adaptive GCxGC were also explained. The experimental part of this thesis describes a standard gas generation system for volatile organic compounds (VOCs) and its use in VOC quantitation with an internal standard, using an SPME arrow as the sampler. The standard generation system was based on diffusion of analyte vapour through a deactivated capillary out of a GC vial. The vapour was carried away by nitrogen gas and then diluted at various mixing ratios with nitrogen gas. The standard gas generation system can produce a gas standard from any compound with a high vapour pressure. The concentration of the gas standard generated was validated against a liquid standard. After choosing an appropriate sampling time with the SPME arrow, calibration curves were constructed with a conventional GC-MS and a portable capillary GC-MS. The internal standard, octanal in this experiment, was generated using a similar method. At various dilution factors of the VOC standard, the peak area of the internal standard was similar despite fluctuations.
This showed that the mass of internal standard extracted on the SPME arrow was relatively constant at different points on the calibration curves. Calibration curves of the VOCs with the internal standard showed an improved correlation coefficient compared to calibration curves of the same VOCs without the internal standard. This showed that the use of an internal standard can compensate for errors during the sampling procedures. For real sample analysis, VOCs emitted from a lemon sample were analyzed using the system to estimate their emission rates. Possible add-ons to the system were also discussed to make the system portable and reduce the uncertainty arising from variations of temperature and humidity in air during air sampling.
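The internal-standard correction described above amounts to calibrating on the ratio of analyte peak area to internal-standard peak area, so that fluctuations in extraction efficiency cancel out. A sketch with made-up peak areas and concentrations (not the thesis measurements):

```python
def response_ratios(analyte_areas, istd_areas):
    """Ratio of analyte to internal-standard peak area per calibration point."""
    return [a / i for a, i in zip(analyte_areas, istd_areas)]

def fit_line(xs, ys):
    """Ordinary least-squares line through (concentration, ratio) points."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

concs = [1.0, 2.0, 4.0, 8.0]              # hypothetical standard concentrations
analyte = [120.0, 250.0, 480.0, 1010.0]   # hypothetical analyte peak areas
istd = [100.0, 102.0, 98.0, 101.0]        # internal-standard areas stay ~constant
slope, intercept = fit_line(concs, response_ratios(analyte, istd))
```

An unknown sample's concentration is then read off as (ratio - intercept) / slope; sampling errors that scale both peak areas equally leave the ratio unchanged.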
  • Pandey, Bivek (2024)
    The concept of digital twins, proposed over a decade ago, has recently attracted increasing attention from both industry and academia. Digital twins are real-time or near-real-time simulations of their physical counterparts and can be implemented across various sectors. In mobile networks, digital twins are valuable for maintenance, long-term planning, and expansion by simulating the effects of new infrastructure and technology upgrades. This capability enables network operators to make informed investment and growth decisions. Challenges in implementing digital twins for mobile networks include resource limitations on mobile devices and scaling the system to a broader level. This thesis introduces a modular and flexible architecture for representing network signals from mobile devices within a digital twin environment. It also proposes a suitable platform for digital twins of mobile network signals and resource-efficient protocols for data transmission. The focus is on developing solutions that ensure scalable and resource-efficient synchronization of real-time or near-real-time data between digital twins and their physical counterparts. The architecture was evaluated through performance testing in two setups: one where data preprocessing occurs on the devices, and another where preprocessing is entirely offloaded to the digital twin platform. Additionally, scalability was assessed by analyzing the platform's ability to handle connections and data transfer from multiple devices simultaneously. The results demonstrate the system's effectiveness and scalability, providing insights into its practical application in real-world scenarios. These findings underscore the potential for widespread adoption and further development of digital twin technologies in mobile networks.
  • Pfeil, Rebecca Katharina (2024)
    Single-cell RNA sequencing (scRNA-seq) allows the analysis of differences in RNA expression between individual cells. While this is usually performed by short-read sequencing, long-read sequencing technologies such as Pacific Biosciences (PacBio) and Oxford Nanopore (ONT) are also applied by researchers. As long-read technologies allow capturing entire RNA molecules, combined with single-cell sequencing this enables the exploration of cell-specific isoform expression patterns. In single-cell sequencing each cell is tagged with a different oligonucleotide, called a barcode, during sequencing, to enable the identification of the origin of each read. With short reads, these are straightforward to identify and correct. However, with the higher error rate of long reads, the identification of the barcodes becomes more challenging. Tools exist for the identification and correction of barcodes in short reads and for combinations of long and short reads, but only a few tools work with long reads exclusively. Additionally, most tools are focused on one specific scRNA-seq protocol. While most protocols work in a similar way, the location, length or other characteristics of the barcodes might differ, meaning not all tools work for all protocols. This thesis introduces a novel barcode calling algorithm for long reads called BArcoDe callinG via Edit distance gRaph, or Badger, which can accommodate different scRNA-seq protocols. The algorithm uses a novel data structure called an edit distance graph, which is based on the Hamming distance graph. Within the graph, every distinct barcode is represented by a node. Edges are added between nodes whose represented barcodes have an edit distance below a certain threshold. As calculating the edit distance is computationally expensive, a filter is used to find similar barcodes, and the edit distance is calculated only between those.
Additionally, the algorithm is implemented and its performance evaluated, both on its own and in comparison to the existing method scTagger; Badger outperforms scTagger in both precision and recall.
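The edit distance graph described above can be sketched in a few lines. This naive version computes all pairwise distances and omits Badger's filtering step; the threshold and barcodes are illustrative, not the thesis defaults:

```python
from itertools import combinations

def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def edit_distance_graph(barcodes, threshold=1):
    """Nodes are distinct barcodes; an edge joins two barcodes whose
    edit distance is at most `threshold`. Badger's filter, which avoids
    this all-pairs computation, is omitted here for clarity."""
    edges = []
    for a, b in combinations(set(barcodes), 2):
        if edit_distance(a, b) <= threshold:
            edges.append((a, b))
    return edges
```

Unlike a Hamming distance graph, the edit distance also tolerates the insertions and deletions that dominate long-read errors.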
  • Aalto, Aleksi Jeremias (2017)
    Geologic knowledge is often inferred from heterogeneous and sparse datasets. Thus, integrating disparate data is one of the essential phases in geologic research. Integrating geologic and geophysical observations and models is typically performed using proprietary modeling software. The methods for integrating data often utilise concepts based on the relational model for data. The relational model provides the theoretical basis for designing data management systems in which the consistency of data is ensured. In this study, the relational model was applied to geologic and geophysical data by designing a data model for integrating reflection seismic data from the Finnish Reflection Experiment together with chosen geophysical and geologic data. The data model was implemented in a relational database management system and the data is handled in a manner that ensures its internal and referential consistency. A web GIS application was designed and implemented to visualise the Finnish Reflection Experiment data together with other relevant datasets. The application utilises a service-oriented architecture, where external services providing data or features are used to enhance the capabilities of the application. The external services are accessed using widely standardised web technologies, and thus it is easy to extend or alter the datasets used in the application. The web GIS application makes it possible to visualise Finnish Reflection Experiment data together with other data using any device with a web browser. The web GIS application has been published as part of the OpenFIRE service, running in the AVAA portal of the Open Science and Research Initiative. For the OpenFIRE service, an additional download service has been designed for the application in collaboration with the AVAA team of the Open Science and Research Initiative.
In the service, the web GIS application can be used to browse Finnish Reflection Experiment data in a proper context together with other datasets. By modeling the data, it was possible to improve the quality of the national geologic data repository from which data was used in the web GIS application. It is recommended that the data management processes of the national data repository be reviewed using the data management theory and methods discussed in this study. Considering that there are currently various ongoing data and research infrastructure programmes without many prototypes in production, the methods discussed and utilised in this study can serve as examples of the design and implementation of a domain-specific data visualisation service and of data management practices, especially in the fields of bedrock geology and solid Earth geophysics.
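The referential consistency that the relational model guarantees can be illustrated with a toy two-table schema; the table and column names here are hypothetical and do not reproduce the OpenFIRE data model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential consistency
conn.executescript("""
CREATE TABLE survey_line (
    line_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL
);
CREATE TABLE seismic_section (
    section_id INTEGER PRIMARY KEY,
    line_id    INTEGER NOT NULL REFERENCES survey_line(line_id),
    cdp_start  INTEGER,
    cdp_end    INTEGER
);
""")
conn.execute("INSERT INTO survey_line VALUES (1, 'line-A')")
conn.execute("INSERT INTO seismic_section VALUES (1, 1, 100, 2400)")
# A section referencing a survey line that does not exist is rejected:
try:
    conn.execute("INSERT INTO seismic_section VALUES (2, 99, 0, 10)")
except sqlite3.IntegrityError:
    print("referential integrity enforced")
```

The database management system, not the application code, rejects inconsistent rows, which is the core benefit the study attributes to the relational model.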
  • Nissinen, Tuomas (2015)
    Particle Induced X-ray Emission (PIXE) is an ion beam analysis technique. In PIXE, atoms in the sample are excited when the sample is bombarded with protons, alpha particles, or heavy ions. X-rays are emitted when the atoms de-excite. Each element has unique characteristic x-rays. In the spectrum, the area of each peak is proportional to the elemental concentration in the sample. The existing PIXE set-up in the accelerator laboratory was upgraded to an external beam PIXE for in-air measurements, because of the need to analyse large amounts of archaeological samples. Different exit window set-ups were constructed and tested. The goal was to get the maximum beam spot area with minimum beam energy loss in the exit window. The set-up enables the use of 100 nm thick Si3N4 exit window membranes and a 4-mm-diameter beam spot area. For the measurements in the current work, a 500 nm thick Si3N4 membrane was used due to its higher durability. Current measurement can be difficult when doing PIXE in air because of ionization of air molecules in the beam's path and charge collection differences at the sample surface. The set-up utilizes a beam profile monitor (BPM), which measures the current in vacuum prior to the exit window and is therefore not affected by the current measurement difficulties in air. Along with the BPM, a current integrator was also used; it collected the charge from the sample holder. These two methods together provided a reliable way of measuring the current. With the developed set-up, 166 Neolithic Stone Age pottery pieces from different parts of Finland, Sweden and Estonia were measured to determine their elemental concentrations for provenance research. AXIL software was used to analyse the spectra.
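The proportionality between peak area and elemental concentration means that a single standard of known concentration fixes the calibration for an element (assuming fixed geometry and collected charge). A toy illustration with made-up numbers:

```python
def sensitivity(peak_area_std: float, conc_std: float) -> float:
    """Counts per unit concentration, fixed by a standard of known concentration."""
    return peak_area_std / conc_std

def concentration(peak_area: float, k: float) -> float:
    """Peak area is proportional to concentration: C = area / k."""
    return peak_area / k

# Hypothetical standard: 5000 counts in the peak at 250 ppm.
k = sensitivity(peak_area_std=5000.0, conc_std=250.0)
print(concentration(2400.0, k))  # -> 120.0
```

In practice the constant also depends on X-ray production cross sections, detector efficiency and absorption, which is what fitting software such as AXIL accounts for.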
  • Kettunen, Ilkka Henrikki (2022)
    The aim of this study was to develop biogeochemical exploration methods for cobalt. Several different types of samples were collected from the study area, analyzed, and compared to each other. The study took place in the Rautio village area in North Ostrobothnia, more precisely over the Jouhineva mineralization, a well-known high-grade cobalt-copper-gold mineralization. The elements examined in this study are cobalt, copper, arsenic, zinc, selenium, and cadmium. Samples were collected from three different study profiles in the area: soil, pine, lingonberry, birch, rowan, and juniper. Water samples were collected around the study area from every location possible. Soil samples were analyzed with four different methods: ionic leaching, aqua regia, weak leaching and pXRF. Ionic leaching and aqua regia both showed elevated concentrations of cobalt, but in different locations depending on the study profile. Ionic leaching detects ions rising from the ore, and therefore elevated concentrations are found at different locations compared to aqua regia. The aqua regia results demonstrated how the orientation of the study profile, the direction of the ore and the glacial flow can affect the anomalies of elemental concentration. Profile-2 was oriented differently to the ore and the glacial flow than Profile-1, and therefore elevated concentrations of cobalt and copper were not drifted away from the ore on Profile-2 as they were on Profile-1. Aqua regia and pXRF gave very similar copper, arsenic and zinc results. Pine and lingonberry turned out to be the most promising plant species for cobalt exploration, and rowan appeared the most suitable for copper exploration. A lower detection limit could significantly improve pine analyses as an exploration method, and more extensive sampling could remove some of the uncertainties about the method. Lingonberry samples had elevated concentrations of copper and arsenic. Birch and juniper produced somewhat unclear results.
Despite this, cobalt and copper concentrations in birch leaves were elevated when compared to concentrations found in other studies. In addition, birch is suitable for arsenic exploration. Juniper had an elevated copper concentration in the study area compared to other studies. Water samples collected from the Jouhineva area yielded concentrations of cobalt, copper and arsenic that were above the average concentrations in the Kalajoki area waters. Copper and arsenic were above the Kalajoki area average in every sample collected from the study area. Cobalt was above the average concentration in all samples except those collected directly from the pond formed in the old test mine. The zinc concentration was below the area average in all samples collected, and in the water samples collected from the pond it was significantly lower than in the other samples from the area.
  • Byggmästar, Jesper (2016)
    Interatomic potentials are used to describe the motion of the individual atoms in atomistic simulations. An accurate treatment of the interatomic forces in a system of atoms requires heavy quantum mechanical calculations, which are not computationally feasible in large-scale simulations. Interatomic potentials are computationally more efficient analytical functions used for calculating the potential energy of a system of atoms, allowing simulations of larger systems or longer time scales than in quantum mechanical simulations. The interatomic potential functions must be fitted to known properties of the material the potential describes. Developing a potential for a specific material typically involves fitting a number of parameters included in the functional form against a database of important material properties, such as cohesive, structural, and elastic properties of the relevant crystal structures. In the Tersoff-Albe formalism, the fitting is performed with a coordination-based approach, where structures spanning a wide range of coordination numbers are used in the fitting process. Including many differently coordinated structures in the fitting database is important to obtain good transferability to structures not considered in the fitting process. In this thesis, we review different types of widely used interatomic potentials and develop an iron-oxygen potential in the Tersoff-Albe formalism. We discuss the strengths and weaknesses of the developed potential, as well as the challenges faced in the fitting process. The potential was shown to successfully predict the energetics of various oxygen-vacancy defect clusters in iron, and the basic properties of the common iron oxide wüstite. The potential might therefore mainly be applicable to atomistic simulations involving oxygen-based defects in solid iron, such as irradiation or diffusion simulations.
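Potential fitting of the kind described above minimizes the mismatch between the analytical function and a database of reference properties. As a deliberately simple stand-in for the Tersoff-Albe form, here is a Lennard-Jones pair potential fitted to synthetic energy points by grid search (the data and parameter ranges are illustrative only):

```python
def lj(r: float, eps: float, sigma: float) -> float:
    """Lennard-Jones pair energy: 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

# "Fitting database" of (distance, energy) points; here generated from
# eps=1.0, sigma=1.0, so the fit should recover those values exactly.
data = [(r, lj(r, 1.0, 1.0)) for r in (0.95, 1.12, 1.5)]

def cost(eps: float, sigma: float) -> float:
    """Sum of squared errors against the fitting database."""
    return sum((lj(r, eps, sigma) - e) ** 2 for r, e in data)

# Crude grid search in place of a proper least-squares optimizer.
best = min((cost(e, s), e, s)
           for e in (0.8, 0.9, 1.0, 1.1)
           for s in (0.9, 1.0, 1.1))
print(best[1], best[2])  # -> 1.0 1.0
```

A real fit replaces the grid search with a least-squares optimizer and the pair potential with the bond-order functional form, but the structure of the problem, parameters minimizing error over a property database, is the same.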
  • Pettersson, Annette (2019)
    Paralytic shellfish poisoning toxins belong to a group of marine biotoxins that can cause severe food poisoning. The marine biotoxins have highly varying properties, such as molecular weight, solubility and toxicity. They accumulate in shellfish during harmful algal blooms. The global fish industry monitors seafood prior to releasing it to the market to ensure the safety of consumers, and permitted levels of marine biotoxins are regulated worldwide. Possible bioterrorism use of marine biotoxins is a concern to governments due to their high toxicity and availability. The main emphasis in this thesis is on paralytic shellfish poisoning toxins: saxitoxin and its analogues. Saxitoxin is listed under the Chemical Weapons Convention. The paralytic shellfish poisoning toxin analogues share physico-chemical properties such as solubility, but they differ greatly from each other in toxicity. The most toxic analogues of the paralytic shellfish poisoning toxins are saxitoxin and neosaxitoxin. The toxicity of the paralytic shellfish poisoning toxins is due to the blocking of voltage-gated sodium channels, which is a reversible process. The symptoms of paralytic shellfish poisoning can include numbness, weakness and even paralysis, which can lead to respiratory failure. Paralytic shellfish poisoning can be lethal and there is no antidote. A new liquid chromatography-tandem mass spectrometric method using multiple reaction monitoring was developed. Several different hydrophilic interaction liquid chromatography type analysis columns were compared. Liquid chromatography-high resolution mass spectrometric analysis methods using full scan and product ion scan were developed for several of the paralytic shellfish poisoning toxin analogues. The limits of detection and quantification for saxitoxin analyzed with the high resolution mass spectrometry method were 0.2 ng/ml and 0.7 ng/ml, respectively.
On average, the detection and quantification limits obtained with high resolution mass spectrometry were around ten times better than the values obtained using triple quadrupole mass spectrometry. New sample preparation methods were developed for four different matrices (mussel, urine, milk and juice), applying solid phase extraction as the primary sample clean-up technique, to be used in proficiency tests. The proficiency test samples analyzed during this thesis contained mussel, urine and unknown saxitoxin samples. The proficiency test samples were analyzed using three different chromatography-based analysis techniques to determine the presence of paralytic shellfish poisoning toxins. A comparison was made between the developed triple quadrupole mass spectrometric method, the high resolution mass spectrometric method and a standardized fluorescence detection method.
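Detection and quantification limits of the kind quoted above are commonly estimated from the blank noise and the calibration slope (the ICH 3.3σ/S and 10σ/S convention); the thesis does not state which estimator it used, and the numbers below are hypothetical:

```python
def lod_loq(sd_blank: float, slope: float):
    """ICH-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S,
    where sigma is the standard deviation of the blank response
    and S is the calibration-curve slope."""
    return 3.3 * sd_blank / slope, 10.0 * sd_blank / slope

# Hypothetical blank noise and slope (response units per ng/ml):
lod, loq = lod_loq(sd_blank=0.03, slope=0.6)
print(lod, loq)
```

Under this convention, a tenfold improvement in detection limit corresponds to a tenfold better noise-to-slope ratio, which is one way to read the HRMS vs. triple quadrupole comparison above.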
  • Trevisan, Lucrezia (2024)
    Stimuli-responsive polymers have emerged as appealing compounds for the development of high-tech and functional materials. In particular, thermoresponsive polymers have been investigated for a variety of applications. Among these, hydrogel production for additive manufacturing is especially attractive. In fact, hydrogels obtained from synthetic, thermoresponsive polymers can be tailored to obtain biocompatible scaffolds for use in the biomedical field. Poly(2-oxazolines) and poly(2-oxazines) stand out as promising starting materials for the production of novel hydrogels. In this work, a thermoresponsive and amphiphilic triblock copolymer composed of 2-methyl-2-oxazoline and 2-phenyl-2-oxazine was investigated to determine whether a suitable candidate for biofabrication purposes could be obtained. The copolymer was first synthesised, after which partial hydrolysis and post-polymerization modification were carried out. These further manipulations made it possible to alter the substituent in position 2 of the 2-methyl-2-oxazoline unit and yield crosslinkable units. The mechanical properties of the triblock were investigated with numerous rheological studies before 3D printing and crosslinking were performed. Crosslinked hydrogels were obtained by using a photoinitiator (Irgacure 2959) and UV radiation. Lastly, the swelling behaviour was investigated to determine the capacity of the hydrogels to absorb water and test their durability over time. Overall, this study provided results on the specific conditions and parameters required for the fabrication of chemically crosslinked hydrogels, which can be optimised in the future to produce functional materials for additive manufacturing applications.
  • Foreback, Benjamin (2018)
    This project has aimed to investigate and propose improvements to the methods used in the System for Integrated ModeLing of Atmospheric coMposition (SILAM) model for simulating biogenic volatile organic compound (BVOC) emissions. The goal is to study an option in SILAM to use the Model of Emissions of Gases and Aerosols from Nature, Version 3 (MEGAN3) as an alternative to SILAM's existing, more simplified BVOC calculation algorithm. SILAM is an atmospheric chemical transport, dispersion, and deposition modelling system owned and continuously developed by the Finnish Meteorological Institute (FMI). The model's best-known use is in forecasting air quality in Europe and Southeast Asia. Although traffic and other urban emissions are important when modelling air quality, accurate modelling of biogenic emissions is also very important when developing a comprehensive, high-quality regional and sub-regional scale model. One of the motivations of this project is that if BVOC emission simulation in SILAM were improved, the improvements would be passed on to the subsequent atmospheric chemistry algorithms which form the molecules responsible for producing secondary organic aerosols (SOA). SOA have significant impacts on local and regional weather, climate, and air quality. The development in this project will therefore offer the potential for future improvement of air quality forecasting in the SILAM model. Because SILAM requires meteorological forecasts as input boundary conditions, this study used output generated by the Environment-High Resolution Limited Area Model (Enviro-HIRLAM), developed by the HIRLAM Consortium in collaboration with universities in Denmark, Finland, the Baltic States, Ukraine, Russia, Turkey, Kazakhstan, and Spain. Enviro-HIRLAM includes multiple aerosol modes, which account for the effects of aerosols in the meteorological forecast.
Running SILAM with and without the aerosol effects included in the Enviro-HIRLAM meteorological output showed that aerosols likely caused a minor decrease in the BVOC emission rate. This project focused on the boreal forest of Hyytiälä, southern Finland, the site of the Station for Measuring Ecosystem-Atmosphere Relations II (SMEAR-II, 61.847°N, 24.294°E), during a one-day trial on July 14, 2010. After performing a test run over the Hyytiälä region in July 2010 for analysis, it was found that SILAM significantly underestimates the BVOC emission rates of both isoprene and monoterpenes, likely because of an oversimplified approach used in the model. The current approach in SILAM, called 'Guenther Modified', uses only a few equations from MEGAN and can be classified as a strongly simplified MEGAN version with selected assumptions. It references a land cover classification map and a lookup table, taking into account only three parameters (air temperature, month, and solar radiation) when performing the calculations. It does not take into account several other important parameters which affect the BVOC emission rates. Based on qualitative analysis, this appears to be a simple but limited approach. Therefore, based on these findings, the next step to improve SILAM simulations is to propose a full implementation of MEGAN, a much more comprehensive model for simulating BVOC emissions from terrestrial ecosystems, as a replacement for the current logic of using land classification and a lookup table for BVOC emission estimates. MEGAN includes additional input parameters, such as Leaf Area Index (LAI), relative humidity, CO2 concentration, land cover, soil moisture, soil type, and canopy height. Furthermore, this study found that SILAM simulations involving BVOCs could also potentially be improved by adding modern schemes for chemical reactions and SOA formation in future development of the model.
After gaining an in-depth understanding of the strengths and limitations of BVOC modelling in SILAM, some practical recommendations for improving the model are proposed.
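As an illustration of the kind of calculation a simplified, temperature-and-radiation-driven scheme like ‘Guenther Modified’ performs, the classic Guenther-type activity factors can be sketched as follows. This is a minimal sketch using the standard published Guenther (1993) coefficients; the function names and the emission-factor argument are illustrative assumptions, not SILAM code.

```python
import math

# Standard Guenther (1993) isoprene activity coefficients, used here
# purely to illustrate a temperature-and-light-driven scheme.
CT1, CT2 = 95000.0, 230000.0   # J/mol
TS, TM = 303.0, 314.0          # K: standard and optimum leaf temperature
R = 8.314                      # J/(mol K)
ALPHA, CL1 = 0.0027, 1.066     # light-response constants

def gamma_temperature(t_kelvin):
    """Dimensionless temperature activity factor."""
    num = math.exp(CT1 * (t_kelvin - TS) / (R * TS * t_kelvin))
    den = 1.0 + math.exp(CT2 * (t_kelvin - TM) / (R * TS * t_kelvin))
    return num / den

def gamma_light(par):
    """Dimensionless light activity factor; par in umol m-2 s-1."""
    return ALPHA * CL1 * par / math.sqrt(1.0 + ALPHA ** 2 * par ** 2)

def isoprene_emission(ef_standard, t_kelvin, par):
    """Emission rate = standard emission factor (e.g. from a land-cover
    lookup table) scaled by the two activity factors."""
    return ef_standard * gamma_temperature(t_kelvin) * gamma_light(par)
```

At standard conditions (leaf temperature near 303 K, PAR near 1000 umol m-2 s-1), both factors approach one and the emission approaches the tabulated standard factor; a full MEGAN implementation would add further activity factors for LAI, leaf age, soil moisture, and CO2.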
  • Alanen, Osku (2017)
    Prostate cancer remains one of the most frequently diagnosed cancers in men. While its localized form typically advances slowly, the aggressive and metastasized forms are responsible for a significant number of deaths among men in developed countries. Thus, more reliable methods of diagnosis are currently highly sought after. Prostate Specific Membrane Antigen (PSMA) remains a highly researched receptor of choice, as it has been found to be overexpressed in the majority of prostate cancers. Several PSMA-targeting inhibitors carrying suitable radioisotopes are already being utilized for PET imaging (18F, 68Ga) and treatment (177Lu) of prostate cancer, and new radiotracers with improved characteristics are in high demand. Fluoroglycosylation is a common method of altering the properties of radiotracers, leading to more desirable tracer characteristics, such as increased renal excretion due to the compound’s increased hydrophilicity. This can be achieved with chemical reactions such as oxime formation, where the molecule is conjugated with a fluorine-containing carbohydrate. The aim of this study was to develop two 18F-labeled PSMA inhibitors via oxime formation, utilizing two 18F-labeled carbohydrates: 5-[18F]fluoro-5-deoxyribose ([18F]9) and 2-[18F]fluoro-2-deoxy-D-glucose ([18F]12). The precursor 3 moiety was successfully synthesized utilizing an amide coupling reaction (yield 56%), followed by acid-catalyzed deprotection. The purification of precursor 3 was achieved by high-performance liquid chromatography (HPLC) with a yield of 29%. The precursor moiety was conjugated with [18F]9 and [18F]12 via oxime formation to yield compounds [19F]4 (yield 39%) and [19F]5 (yield 39%), respectively. This was followed by the synthesis of their respective radioisotopes, [18F]4 and [18F]5.
[18F]9 exhibited more favorable labeling characteristics with precursor 3 than [18F]12, likely due to its readily available aldehydic form and milder reaction conditions. Conjugation of [18F]9 with the precursor 3 moiety was successfully achieved in 15 minutes at room temperature in the presence of 0.3 M anilinium acetate buffer, with a radiolabeling yield of up to 91% (1.5 mM peptide concentration). By comparison, conjugation with [18F]12 was achieved in 30 minutes at 85 ℃ in the presence of aniline, with a radiolabeling yield of 57% (9.8 mM peptide concentration). Minor by-product formation was also evident with [18F]5, while the reaction appeared more specific with [18F]4. Purification of [18F]4 was achieved by HPLC, yielding the radiotracer with 98% radiochemical purity. Similarly, purification of [18F]5 was demonstrated with HPLC using a smaller batch, yielding the product with a radiochemical purity of 88%. Minor degradation of the oxime ether bond into free [18F]9 or [18F]12 was evident as a function of time in an acidic environment, especially with [18F]5. The lipophilicity of the compounds was also determined by the shake-flask method. Both compounds were found to be highly hydrophilic, with LogD7.4 values of −2.8±0.3 and −3.1±0.2 for [18F]5 and [18F]4, respectively. Further experiments should be performed to optimize the radiosynthesis protocols for higher activities and to determine the minimum peptide concentration and reaction time needed for oxime ether formation. Additionally, the molar activities of the compounds should be determined, and the PSMA inhibition potency (IC50) of [18F]4 should be evaluated prior to any in vivo trials to better assess its potential as a possible PSMA inhibitor.
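The shake-flask LogD7.4 values quoted above come down to a simple ratio: the base-10 logarithm of the activity partitioned into the octanol phase over the activity in pH 7.4 buffer. A minimal sketch, with hypothetical counts:

```python
import math

def log_d(activity_octanol, activity_buffer):
    """Distribution coefficient from shake-flask partitioning:
    LogD7.4 = log10(activity in octanol / activity in pH 7.4 buffer)."""
    return math.log10(activity_octanol / activity_buffer)

# Hypothetical measurement: ~1000x more activity in the aqueous phase
# corresponds to a LogD of about -3, i.e. a highly hydrophilic tracer.
```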
  • Häppölä, Niko (2024)
    Introduction: The EU Medical Device Regulation (MDR) sets requirements for medical device software (MDSW) development. Following international standards, such as IEC 62304 and IEC 82304-1, is considered best practice to ensure compliance with the regulation. At first glance, the MDR and these standards seem at odds with the DevOps approach. DevOps has been successful in regular software development, and it could improve MDSW development. In addition, standalone software is increasingly prevalent as a medical device, and since such software does not need to be embedded into a physical device, the DevOps approach should be more feasible. Methods: In this thesis, a systematic multivocal literature review was conducted. The goal was to establish the state of the art of DevOps in MDSW development: which DevOps techniques and practices are suggested by academic literature and industry experiences, and what the challenges and benefits of DevOps are in MDSW. 18 scientific articles and 10 sources of gray literature were analyzed. Results: The DevOps benefits of improved quality and a faster release cycle can be achieved up to a certain point. Regulations prevent Continuous Deployment, but Continuous Integration (CI) and Continuous Delivery (CD) are possible. The most promising improvements can be made by automating documentation creation and by bringing the tasks of regulatory experts and developers closer together through a streamlined regulatory process. Existing DevOps tools can be extended to support compliance requirements. Third-party platforms and AI/ML solutions remain problematic due to regulations.
  • Sokkanen, Joel (2023)
    DevOps software development methodologies have steadily gained ground over the past 15 years. Properly implemented DevOps enables software to be integrated and deployed at a rapid pace. The implementation of DevOps practices creates pressure for software testing: in the world of fast-paced integrations and deployments, software testing must perform its quality assurance function quickly and efficiently. The goal of this thesis was to identify the most relevant DevOps software testing practices and their impact on software testing. Software testing in general is a widely studied topic; this thesis looks into the recent developments of software testing in DevOps. The primary sources of this study consist of 15 academic papers, which were collected with systematic literature review source collection methodologies. The study combines both systematic literature review and rapid review methodologies. The DevOps software testing practices associated with a high level of automation, continuous testing, and DevOps culture adoption stood out in the results. These were followed by practices highlighting the need for flexible and versatile test tooling and test infrastructures. DevOps adoption requires team composition and responsibilities to be carefully planned, and testing practices to be chosen with equal care. Software testing should be primarily organized in highly automated DevOps pipelines. Manual testing should be utilized to validate the results of the automatic tests. Continuous testing, multiple testing levels, and versatile test tooling should be utilized. Integration and regression testing should be run on all code changes. Application monitoring and the collection of telemetry data should be utilized to improve the tests.
  • Juva, Katriina (2016)
    The temperature and salinity fields (i.e. the hydrography) of the Baltic Sea determine the density, and hence the stratification and density-driven circulation, of the sea. These features are affected by changes in the hydrologic circulation, most importantly by changes in the atmospheric circulation and in the water exchange with the North Sea. The aims of this thesis are to study the hydrographical conditions and changes in the surface and bottom layers of the Baltic Sea for the period 1971 - 2007, as well as the model's sensitivity to a number of variables. The surface layer is well studied, but bottom-layer studies covering the whole Baltic Sea are rare. The halocline and thermocline depths are also included, since they provide information about mixing. By combining information from the surface and the bottom, an overview of the whole hydrographical state is provided. For the analysis, three hindcast simulations based on the three-dimensional North-Baltic Sea model are used. The simulations differ in the number of vertical layers, the initial conditions, and the strength of the bottom drag coefficient. The results show that the vertical stratification is weaker in the model than what is observed in in-situ measurements. The simulations differ remarkably in the salinity level and in its evolution. On average, the salinity is decreasing by 0.1 - 0.4 ppt per decade, except in the deepest parts of the Baltic Proper. The temperature is increasing at the surface and above the permanent halocline by 0.2 - 0.4 degrees Celsius per decade on average. Large regional differences between the west and east coasts of the basins were found. A bottom temperature increase of up to 1 degree Celsius per decade was found on the eastern coast of the eastern Gotland Basin, whereas on the Swedish coast the changes are more moderate and, during some months, opposite.
On the opposite side of the Bothnian Sea and the Gotland Basin, monthly anomalies of up to a degree Celsius were found for the autumn months. In the deeper layers, the temperature decreases by 0.2 - 0.4 degrees Celsius per decade. The study showed that the Baltic Sea is undergoing a rapid change. In order to get a more detailed view of the changes in stratification and circulation, the changes in density should be studied next.
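The link stated above between hydrography and stratification can be sketched with a linearized equation of state: density rises with salinity and falls with temperature, so a cold, saline bottom layer under a warmer, fresher surface layer is stably stratified. The reference values and expansion coefficients below are rough brackish-water numbers chosen for illustration, not the model's actual equation of state.

```python
def density_linear_eos(temp_c, salinity_psu,
                       rho0=1000.0, t0=10.0, s0=7.0,
                       alpha=1.7e-4, beta=7.8e-4):
    """Linearized equation of state: density (kg/m3) decreases with
    temperature and increases with salinity. Coefficients are rough
    brackish-water values chosen for illustration only."""
    return rho0 * (1.0 - alpha * (temp_c - t0) + beta * (salinity_psu - s0))

# A cold, saline bottom layer under a warmer, fresher surface layer
# yields a stable density stratification.
```

In this picture, a freshening trend at the surface strengthens the halocline, while surface warming weakens the thermal contribution to stratification; this is why the observed salinity and temperature trends both matter for the circulation.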
  • Turkkila, Miikka (2018)
    The purpose of this work was to develop tools for unpacking online discussions. Such tools would enable rapid analysis of an online discussion, allowing it to be used, for example, to understand the social dimensions of online learning by comparing the dialogue structure with theories of social structures and interactions. In addition, the tools can be used to support research on teaching and to develop web-based instruction. The theoretical background is research on online learning, with computer-supported collaborative learning as the specific topic area. The term covers a wide range of teaching activities, but in this work it refers to a group discussion conducted over the web with learning as its goal. The material consisted of 16 online discussions, in which four groups of four people each discussed four different topics related to quantum physics. Thematic analysis was used as the research method to verify the content and structure of the discussions. This was followed by social network analysis of the discussion structure, in particular using the approach developed by McDonnell et al. (2014) of analyzing a network through triadic roles, i.e. roles formed from triples of participants. For the analyses, the discussions were tabulated such that each message was recorded with its sender and sending time, the participant the message was directed to, and the themes it contained. Python scripts were then written to visualize the dialogue structure and to count the roles occurring in it. The results showed that the groups discussed according to the assignment, and that the dialogue structure of an online discussion can be represented graphically as a so-called asynchronous temporal network. In addition, the roles occurring in the discussions can easily be counted and presented as a so-called heat-map figure. The goals of the work were met, and the Python scripts written for it significantly shorten the analysis of online discussion structure.
In addition, the results can potentially be used to understand social structures within a group. This, however, requires further work to connect the counting model used here with the theories.
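The triadic-role counting described above can be sketched in Python. The message tuples, the census keying, and the toy data below are simplified illustrations, not the scripts written for the thesis or the exact McDonnell et al. (2014) scheme.

```python
from collections import Counter
from itertools import combinations

# Toy message table: (sender, recipient) pairs standing in for the
# tabulated discussion data (sender, time, recipient, themes).
messages = [("a", "b"), ("b", "c"), ("c", "a"), ("a", "c")]

def triad_census(edges):
    """Count directed triad patterns over all unordered node triples.
    Each triple is keyed by which of its six possible directed edges
    is present -- a simplified stand-in for a triadic-role census."""
    edge_set = set(edges)
    nodes = sorted({n for e in edges for n in e})
    census = Counter()
    for a, b, c in combinations(nodes, 3):
        pattern = tuple(int(e in edge_set) for e in
                        [(a, b), (b, a), (b, c), (c, b), (a, c), (c, a)])
        census[pattern] += 1
    return census
```

Summing such patterns per participant, and binning messages by sending time, would give the kind of role counts that can be rendered as a heat map.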
  • Martikainen, Laura (2017)
    Radiation detectors are devices used to detect ionizing radiation. They can be manufactured from different materials for different purposes. Chemical vapour deposition (CVD) diamond detectors are semiconductor radiation detectors manufactured from artificial diamond grown using the CVD method. The physical properties of diamond make diamond detectors fast and radiation hard, and hence a favourable option for precise timing measurements in harsh radiation environments. The work presented in this thesis was done as part of a detector upgrade project of the TOTEM experiment at the Large Hadron Collider of CERN, the European Organization for Nuclear Research. The upgrade program includes the development and construction of a timing detector system based on CVD diamond, in order to add the capability to perform precise timing measurements of forward protons. A new I-V measurement setup was built for quality assurance measurements of diamond crystals before their further processing into timing detectors. When the setup was operated, several problems were observed, including electrical discharging, instabilities of the leakage currents, and unexpectedly high leakage current levels. The undesired effects disappeared when the electrical contact used for supplying bias voltage to the measured samples was modified. Results of both the quality assurance measurements and the measurements made during setup development are presented.
  • Redmond Roche, Benjamin Heikki (2019)
    Significant changes in sea-ice variability have occurred in the northern North Atlantic since the last deglaciation, resulting in global-scale shifts in climate. By relating the dynamic changes of palaeo sea ice to past changes in climate, it is possible to predict future changes in response to anthropogenic climate change. Diatoms allow for detailed reconstructions of palaeoceanographic and sea-ice conditions, both qualitatively, using information on species ecologies, and quantitatively, via a transfer function based upon diatom species optima and tolerances for the variable to be reconstructed. Three diatom species comprising a large portion of the training set serve as proxies for the presence of sea ice: Fragilariopsis oceanica, Fragilariopsis reginae-jahniae and Fossula arctica, which have currently been grouped into one species – F. oceanica – in the large diatom training set of the northern North Atlantic region. This clustering of the species may result in an imprecise reconstruction of sea ice that does not take into account all the available ecological information. The proportions of the three species were recounted from the original surface sediment slides, alongside the additional chrysophyte cyst Archaeomonas sp., and statistically analysed using Canoco and the R software package eHOF. A core from the Kangerlussuaq Trough spanning the Late Holocene (~690–1498 Common Era) was also recounted and analysed using C2. The separated diatom species and the chrysophyte cyst Archaeomonas sp. exhibited different relationships to both sea-ice concentration (aSIC) and sea surface temperature (aSST). The separated F. oceanica is a ‘cold-mixed’ water species occurring at cold aSST and both low and high aSIC. High abundances occur in the marginal ice zone (MIZ), where surficial meltwater is high during the spring bloom, with additional inputs from glacial meltwaters nearshore. F. reginae-jahniae is a sea-ice associated species related to cold aSST and high aSIC.
High abundances occur in the low-salinity Arctic Water dominated MIZ, which experiences significant aSIC. F. arctica is a sea-ice associated species related to cold aSST and high aSIC. High abundances occur in the low-salinity Arctic Water dominated MIZ experiencing high aSIC, particularly in polynya conditions; at high abundances, F. arctica can be considered a characteristic polynya species. Archaeomonas sp. is a ‘cold-mixed’ water species related to both cold and relatively warm aSST, and to both low and high aSIC. High abundances occur both in relatively warm, ice-free Atlantic Water and in cold, high-aSIC Arctic Water conditions, rendering it a more complex proxy for aSST or aSIC. However, its aversion to MIZ conditions indicates that Archaeomonas sp. is associated with a relatively saline, unstratified water column. This is the first time that the distribution and ecology of Archaeomonas sp. have been presented; as such, the ecology described here can be used in future studies. The separation of the three diatom species is crucial for the ecological interpretation of downcore assemblage changes. It is also crucial for the application of transfer functions, in order to reconstruct aSIC with greater precision and to assess the influence of Arctic Water or Atlantic Water, even at low abundances.
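The transfer-function idea invoked above (reconstructing an environmental variable from species optima and tolerances) can be sketched in its simplest weighted-averaging form. The optima and sample counts below are invented for illustration; real optima are estimated from the training set, and practical transfer functions add tolerance downweighting and deshrinking.

```python
def wa_reconstruct(abundances, optima):
    """Weighted-averaging transfer function: the reconstructed value
    is the abundance-weighted mean of the optima of the species
    present in the sample."""
    total = sum(abundances.values())
    return sum(abundances[sp] * optima[sp] for sp in abundances) / total

# Invented sea-ice concentration optima (fraction of cover) for the
# three separated species; real optima come from the training set.
optima = {"F. oceanica": 0.4, "F. reginae-jahniae": 0.7, "F. arctica": 0.8}
sample = {"F. oceanica": 10, "F. reginae-jahniae": 30, "F. arctica": 60}
```

Lumping the three species under one optimum would force every occurrence through a single sea-ice response, which is why separating them sharpens the reconstruction.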
  • Heikkinen, Janne (2020)
    Subarctic ponds are important habitats for many freshwater species. The recent increase in global temperatures has emphasized the need to study these habitats, as rising water temperatures may have severe consequences for these cold and harsh ecosystems. Despite its importance, this topic has been largely overlooked in scientific research. Diatoms are microscopic, single-celled benthic algae that are important indicators of environmental quality. Elevation is one of the main environmental variables controlling the composition and richness of diatom species, as it shapes communities through several environmental variables such as temperature and water chemistry. The aim of this thesis was to illustrate the variability in diatom species richness and community composition along an elevational gradient in Kilpisjärvi and to reveal the most important environmental drivers. As an additional focus, the applicability of the BenthoTorch sampling device for measuring benthic algae biomass was tested. Field and laboratory measurements were done using universal standards. Statistical analyses included multiple univariate and multivariate data analysis techniques. It was found that water pH, aluminium concentration, and air temperature explained the variation in species richness and community composition the most. Elevation had only a secondary, non-significant role in shaping the diatom communities of subarctic ponds. Nearby sites showed similar compositions in terms of water chemistry and diatom communities. Biotope characterisation did not provide any further insight into the differences or similarities of diatom community composition or species richness. There were some differences in how genera responded to environmental variables. The centre of the distributional range of many taxa was below the mid-point of the elevational gradient, but species often occupied the whole gradient. Rare taxa appeared at the ends of the elevational spectrum.
The proportion of singleton taxa was high (25.8%) and can be expected to increase with climate change. The BenthoTorch provided reasonable results for benthic algae in the subarctic when compared with previous literature, but further research is required to grasp its full potential. Further examination of the relationships between explanatory variables (e.g. total phosphorus and ion balance) is suggested to gain a better understanding of the changes in diatom species richness and community composition along elevational gradients.
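The richness and singleton figures discussed above follow directly from a taxon-by-count table. A minimal sketch, with invented counts:

```python
def richness_and_singletons(counts):
    """Species richness (taxa with nonzero counts) and the share of
    singletons (taxa observed exactly once) in a pooled count table."""
    richness = sum(1 for c in counts.values() if c > 0)
    singletons = sum(1 for c in counts.values() if c == 1)
    return richness, singletons / richness

# Invented taxon counts for illustration only.
counts = {"Achnanthidium": 42, "Eunotia": 1, "Tabellaria": 7, "Frustulia": 1}
```

A high singleton share like the 25.8% reported here signals many taxa at the edge of detection, which is why richness estimates in such data are sensitive to sampling effort.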