
Browsing by Title


  • Hynynen, Antti (2023)
    The tear film is a thin liquid layer enveloping the cornea and conjunctiva. It serves as a crucial protective barrier safeguarding the ocular surface from environmental factors. Its functions also include ensuring optimal hydration and maintaining a consistently smooth optical surface. The tear film consists of three discernible layers. The outermost layer is the remarkably thin tear film lipid layer, measured on the nanometer scale, yet crucial for impeding evaporation and maintaining ocular surface homeostasis. The tear film lipid layer forms from meibum: a complex mixture of lipids secreted by Meibomian glands. Meibum mainly consists of lipids from five different classes, with wax esters constituting about 50 % of meibum’s composition. The majority of tear film lipids are branched. Branched lipids are methyl-branched at either the penultimate or the antepenultimate carbon, with the former being more common. Due to challenges in obtaining sufficient and uncontaminated meibum samples, there is a high demand for synthetically produced lipids. Pure samples of lipids of varying structures are needed as standards in studies regarding meibum composition. Additionally, lipids and lipid mixtures are required for biophysical studies concerning tear film models and researching the molecular mechanisms associated with ocular surface diseases. Given the demand for various tear film lipids, it is surprising that there have been no prior attempts to synthesize branched lipid species or study their effect on lipid assembly, despite their prevalence in meibum. To meet the demand for various tear film lipids, especially branched species, we aimed to design a synthetic approach capable of producing a wide range of such tear film lipid analogs. This method proved successful in synthesizing the most abundant tear film lipid, the iso-branched wax ester C18:1/26:0. The results of this synthesis are presented in this thesis.
  • Vilja, Jaakko (2016)
    This thesis examines Paul Ernest's social-constructivist and Alain Badiou's Platonist philosophy of mathematics, together with some ideas in mathematics pedagogy that relate to these philosophical positions, on the one hand in Ernest's own texts and on the other in pedagogical writings inspired by Badiou's thought. Ernest advocates a social-constructivist view according to which mathematics is an intersubjectively created and maintained domain of linguistic activity, and mathematical objectivity means the same as social acceptance. He builds the foundation of his philosophy of mathematics largely on the ideas of Imre Lakatos and Ludwig Wittgenstein. However, Ernest ends up taking a stance on physical reality, whose nature, and the way it enters the interplay of subjects, remain obscure. In addition, his way of criticizing so-called absolutist philosophies of mathematics remains thin. The philosopher and mathematician Badiou offers an alternative ontological framework, loosely definable as Platonist, in which what exists is indeterminate multiplicity, and the handling of that multiplicity is unavoidably mathematical in nature. In this view, the perceived mathematical character of physical reality is identified with the mathematical character of mathematics itself: both are about structuring what is essentially unstructured. Many mathematics educators have been inspired by Badiou's ideas and have developed pedagogical ideas on their basis. These pedagogical positions largely resemble Ernest's corresponding ideas. Open-ended tasks, problem-centered teaching and support for critical discussion help to bring about mathematical events and support the formation of students' mathematical subjectivity, instead of teaching that concentrates on transferring ready-made knowledge structures as such into students' heads. At the end of the thesis, the author presents what he considers the best way to combine the differing doctrines of these thinkers, as well as their consequences for the teaching of mathematics. In addition, possibilities for further philosophical and human-scientific research are mentioned.
  • Tillder, Eva-Maria (2015)
    Since the release of the Lord of the Rings movie trilogy (2001, 2002, 2003), New Zealand has been visited by millions of people wanting to experience the mythical Middle-earth they saw on the silver screen. The Kiwi director Peter Jackson's interpretation of the epic story of adventure, sacrifice and friendship in the fantasy land created by the English author J.R.R. Tolkien has widely been seen as 'authentic', and New Zealand as a fair representation of Middle-earth with its lush valleys, barren wastelands and remote mystical mountains. Film tourism encompasses any tourist activity that is induced by the viewing of the moving image. It is a relatively new field of study, and even though the Lord of the Rings is the most studied topic within the film tourism literature, the on-site tourism experience process is still not fully understood. The research concentrates on the experiences of three different types of Lord of the Rings tours: Hobbiton Movie Set is the only built set left in New Zealand; Nomad Safaris, besides the movies, also focuses on other activities and the rugged landscape around the Queenstown area; whereas Adventure Safari Movie Tours tries to compensate for the fact that all of its locations are in public parks by using re-enactments and movie clips. The main focus of the study is to find out how the tour participants experience their tours: what elements are present in a film tourism experience, and what qualities make the locations and experiences authentic? Reviews written on the TripAdvisor website are used as the main source of material. Triangulation is the most appropriate method for analysing user-generated data, as it gives an opportunity to combine qualitative and quantitative methods. Content analysis can be used to analyse the reviews, but because of the subjectivity of experiences it is also relevant to compare the results of the study to the researcher's own experiences and observations from the chosen Lord of the Rings tours. The film tourist experience is a complex research matter, as it is a highly subjective sum of many parts. The experience starts already in the planning phase, when expectations are formed, and the anticipation builds up during the travel to the attraction. The experience at the attraction itself encompasses both physical and social aspects, and authentic locations and interactions are the key to a satisfying film tourist experience. With the help of the guides' interpretative skills, the narratives turn into myths and reality merges with hyperreal fantasy. Memorabilia, such as rocks, t-shirts and pictures gained from the attraction, help in the post-trip recollection phase and work as status symbols. The results indicated that the sample reflected the typical Lord of the Rings tour participant as identified in previous research, with reviewers being mainly part of the working-age population (25-49 year olds), predominantly female, and from the Western and English-speaking markets, the majority being from Australia and the USA. The biggest negative factor in the film tourist experience is poor value for money, whereas a good tour guide is the most important part of the value creation. Passionate tour guides are even able to turn weaker locations into experiences worth paying for. The scenery seems to be the balancing factor, as a bad tour guide combined with a beautiful location seems to lead to an average experience. Hobbiton is seen as the most authentic and iconic of the three studied locations because of its lifelike details and the surrounding pastoral landscape. The set is the actual place where the outdoor scenes of Hobbiton were filmed. The Nomad Safaris participants, on the other hand, immerse themselves in the beauty of the natural landscape and do not even miss the set structures.
  • Daubaris, Paulius (2021)
    Designing software for a variety of execution environments is a difficult task. This is due to a multitude of device-specific features that must be taken into account. Hence, it is often difficult to determine all the available features and produce a single piece of software covering all possible scenarios. Moreover, with varying resources available, monolithic applications are often hardly suitable and need to be modularized while still providing all the necessary features of the original application. By employing units of deployment, such as components, it is possible to retrieve the required functionality on demand, thus adapting to the environment. Adaptivity has been identified as one of the main enablers that allow leveraging offered capabilities while reducing the complexity related to software development. In this thesis, we produced a proof-of-concept (PoC) implementation leveraging WebAssembly modules to assemble applications and adapt to a particular execution environment. Adaptation is driven by the information contained in metadata files. Modules are retrieved on demand from one or more repositories based on the characteristics of the environment and integrated during execution using dynamic linking capabilities. We evaluate the work by considering the impact of modular WebAssembly applications and comparing them to standard monolithic WebAssembly applications. In particular, we investigate startup time, application execution time, and the overhead introduced by the implementation. Finally, we examine the limitations of both the technology used and the implementation, and provide ideas for future work.
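    As a rough illustration of the adaptation logic described above, the Python sketch below (not the thesis's PoC, which works with WebAssembly runtimes directly) shows how a module could be picked from a repository listing by matching hypothetical metadata fields against the characteristics of the execution environment.

```python
# Illustrative sketch only (hypothetical metadata fields, not the thesis's PoC):
# choose a WebAssembly module from a repository listing by matching its declared
# requirements against the capabilities of the current execution environment.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ModuleDescriptor:
    name: str
    url: str                # where the .wasm binary would be fetched from
    requires_simd: bool     # capability the module needs from the runtime
    min_memory_pages: int   # minimum linear memory, in 64 KiB pages

def select_module(candidates: List[ModuleDescriptor],
                  env: dict) -> Optional[ModuleDescriptor]:
    """Return the first module whose requirements the environment satisfies."""
    for m in candidates:
        if m.requires_simd and not env.get("simd", False):
            continue
        if m.min_memory_pages > env.get("memory_pages", 0):
            continue
        return m
    return None

if __name__ == "__main__":
    repo = [
        ModuleDescriptor("filter_simd", "https://repo.example/filter_simd.wasm", True, 256),
        ModuleDescriptor("filter_plain", "https://repo.example/filter_plain.wasm", False, 64),
    ]
    chosen = select_module(repo, {"simd": False, "memory_pages": 128})
    print(chosen.name if chosen else "no suitable module")   # -> filter_plain
```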
  • Rättö, Ronja (2021)
    In this master's thesis, the biochemistry of cancer, glucose transporters in cancer therapy and the importance of imaging are explored, and the synthesis of applicable glycoconjugates is introduced. Bioorthogonal applications of functionalizable glycoconjugates, such as drug delivery, imaging, cancer therapy and studying biological phenomena inside living cells, are reviewed. The basic theories on the biochemistry of cancer, cancer imaging with [18F]-2-fluoro-2-deoxy-D-glucose positron emission tomography (18FDG-PET) and glucose transporters are discussed in the literature review section. The databases used to scan relevant literature were SciFinder and the Helsinki University Library (Helka) web article search. The first of the glycoconjugates synthesized, 1,2,3,4-tetra-O-benzyl-6-O-propargyl-D-glucopyranose, has already been published in the literature as an intermediate on the route towards improved delivery agents for BNCT. The final product has been assessed in in vitro studies, with cellular uptake and cytotoxicity studies giving outstanding results. The boron delivery capacity of the molecule is prominently superior to the agents currently in clinical use. The glucoconjugate can also be functionalized for use in other applications through the propargyl conjugation site. The second synthesis covers two potential bioorthogonal chemistry glycoconjugates: α-D-mannopyranoside, phenyl 6-azido-6-deoxy-3,4-O-[(1S,2S)-1,2-dimethoxy-1,2-dimethyl-1,2-ethanediyl]-1-thio-, 2-trifluoromethanesulfonate and -2-(4-methylbenzenesulfonate). These mannopyranoside derivatives are functionalizable glycoconjugates with a wide range of applications in biological research. They both contain three conjugation sites that can be functionalized for further applications. The carbohydrate part is recognized by cells, the azide functional group at C-6 can be modified through click chemistry, the thiol can be activated in glycosylation reactions, and the triflate or tosylate at C-2 can be substituted in an SN2 reaction, for example with [18F] to enable imaging. This elegant toolkit offers a wide range of bioorthogonal opportunities for chemists and biologists alike.
  • Kuisma, Salla (2017)
    Changes in the transport environment and the resulting need to manage transport demand require a better understanding of travel behaviour. The concept of mobility is defined as the potential for movement, and is well suited to this purpose. However, despite the definition, mobility has usually been measured in terms of realized travel indicators (descriptive data of past travel), because potential for movement is hard to capture. To make mobility easier to approach in practice, this work elaborates the mobility concept, describes a conceptual model for it, and implements a more comprehensive approach to it through a survey. The work was done in two parts: 1) The multidisciplinary literature was reviewed, and three specialists were interviewed to construct a conceptual mobility model that specifies the relevant factors comprising mobility. 2) A survey on daily travel was conducted in five Finnish cities. The survey focused on three issues of mobility: personal travel preferences (in terms of features), resources and experienced constraints. The results show that mobility is an amalgam of personal variables (background, life situation, personality, identity, preferences, needs, resources and routines), situation-specific and environment-related factors, decision-making processes, and realized travel. These are specified in the mobility model. The survey results show that when respondents were asked to rate sixteen travel features in terms of importance, those valued the highest on average were reliability, rapidity and freedom from transport timetables. The features were valued differently depending on the trip. On a grocery store trip, for example, reliability was not as important as on work or leisure activity trips, whereas boot space for goods was considered essential. Active users of cars, public transport and bicycles had different priorities than their non-active counterparts. Car drivers appreciated the rapidity, reliability, freedom from transport timetables, possibility to drive, avoiding walking, convenient boot space, privacy, and avoiding changing vehicles and going outdoors in bad weather, afforded by their vehicle. Users of public transport valued its environmental friendliness, low cost, possibility to focus elsewhere than on driving, and physical exercise. Understanding personal preferences has the potential to contribute, among other things, to smarter demand management. The results also show that over 90% of the respondents experienced some of the six defined constraints on their daily mobility: lack of time, lack of money, low energy or difficulty coping, safety concerns, lack of a suitable vehicle, or physical disability. Low energy or difficulty coping was the most common constraint, with 82% experiencing it at least slightly and 34% quite a lot or very much. The respective figures for lack of time, which was the second most common constraint, were 65% and 32%. The constraints were related to personal variables, which supports earlier findings. The results indicate that the personal-resource perspective can increase our understanding of mobility. In particular, the mental resources needed for travel seem to be a relevant issue in mobility that is rarely considered and therefore requires greater attention.
  • Islam, Md. Mesbahul (2016)
    Ensuring a high-quality and reliable wireless network at crowd events, such as sports events in a stadium or exhibitions in large convention halls, is challenging. Standard 802.11 Wi-Fi is based on CSMA (carrier sense multiple access) technology; however, using TDMA (time division multiple access) instead might bring benefits such as energy savings, fair use of bandwidth and more predictable performance in crowded networks. Using TDMA with standard Wi-Fi hardware has been proposed before, but the proposed solutions would require changes to the MAC layer, which means modifying NIC drivers and even firmware. The goal of this thesis is to investigate whether a simple TDMA scheme could be implemented on the application layer for crowded Wi-Fi networks. 'Application-layer TDMA' means that the network stack is left unchanged, and the applications on different devices agree on a schedule for sending packets in their dedicated time slots. The experiments of the thesis were carried out in an RF-shielded laboratory at the University of Helsinki, and from them we evaluated the performance of the application-layer TDMA and compared it with CSMA/CA.
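    The following minimal Python sketch illustrates the idea of application-layer TDMA described above: the network stack is untouched, and the application simply waits for its assigned slot, computed from a shared clock, before sending a UDP packet. The slot layout, address and timing values are illustrative assumptions, not the thesis's experimental configuration.

```python
# Illustrative application-layer TDMA sketch (assumed parameters, not the
# thesis's test code): send only inside this device's own time slot.
import socket
import time

SLOT_MS = 50          # slot length in milliseconds (assumed)
NUM_SLOTS = 20        # slots per TDMA frame (assumed)
MY_SLOT = 3           # slot index assigned to this device (assumed)

def wait_for_my_slot() -> None:
    """Sleep until the start of this device's next slot in the frame."""
    frame_ms = SLOT_MS * NUM_SLOTS
    while True:
        pos = (time.time() * 1000.0) % frame_ms   # position within the frame
        slot_start = MY_SLOT * SLOT_MS
        if slot_start <= pos < slot_start + SLOT_MS:
            return
        wait = (slot_start - pos) % frame_ms      # time until our slot begins
        time.sleep(wait / 1000.0)

def send_in_slot(payload: bytes, addr=("192.0.2.10", 9000)) -> None:
    wait_for_my_slot()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, addr)

if __name__ == "__main__":
    send_in_slot(b"sensor reading")
```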
  • Järvinen, Juiju (2024)
    Positron emission tomography (PET) is today one of the most popular imaging techniques for imaging the body's metabolism and chemical changes. PET imaging requires a radiotracer, and fluorine-18 is currently the most widely used radioisotope. Bioorthogonal and click reactions have received much attention in the preparation of new radiotracers. Bioorthogonal reactions are used to target reactions so that they proceed rapidly under biological conditions, for example tetrazine-trans-cyclooctene reactions. Click reactions are useful in radiolabeling, since short half-lives call for fast reactions. For example, sulfur(VI) fluoride exchange (SuFEx) reactions would be efficient for this purpose. The aim of this work was to create a new tetrazine bearing a SuFEx group that can be radiolabeled with fluorine-18. The aliphatic part of the desired molecule was produced in high yield. The yield of the tetrazine was low, but expected based on the literature. The designed bromine linker was successfully attached to the amine group of the tetrazine. However, the compound obtained did not react well with the aliphatic part. Several test reactions were carried out, such as using different linkers between the tetrazine and the aliphatic part, using a smaller, commercially available aliphatic part, and adding the SuFEx group directly to the tetrazine. Some of these experiments were successful, but the final product of the synthesis could not be analyzed and therefore could not be radiolabeled.
  • Rodriguez Beltran, Sebastian (2024)
    DP-SGD (differentially private stochastic gradient descent) is the gold-standard approach for implementing privacy in deep learning settings. DP-SGD achieves this by clipping the gradients during training and then injecting noise. The algorithm aims to limit the impact of any single data point from the training dataset on the model. Therefore, the gradient of an individual sample contributes information only up to a certain point, limiting the chances of an inference attack discovering the data used for the model training. While DP-SGD ensures the privacy of the model, there is no free lunch, and it has its downsides in terms of the utility-privacy trade-off and an increase in the computational resources needed for training. This thesis aims to evaluate different DP-SGD implementations in terms of performance and computational efficiency. We compare the use of optimized clipping algorithms, different GPU architectures, speed-ups from compilation, lower precision in the data representation, and distributed training. These strategies effectively reduce the computational cost of adding privacy to deep learning training compared to the non-private baseline.
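    The core DP-SGD step described above, per-example clipping followed by noise injection, can be sketched in a few lines of NumPy. This is an illustrative sketch rather than any of the implementations benchmarked in the thesis, and the parameter values are arbitrary.

```python
# Minimal NumPy sketch of a DP-SGD step: clip each per-example gradient to
# L2 norm C, sum, add Gaussian noise calibrated to C, and average.
import numpy as np

def dp_sgd_update(per_example_grads: np.ndarray,
                  clip_norm: float = 1.0,
                  noise_multiplier: float = 1.1,
                  lr: float = 0.1) -> np.ndarray:
    """per_example_grads has shape (batch_size, num_params)."""
    batch_size = per_example_grads.shape[0]
    # 1) clip each example's gradient to L2 norm <= clip_norm
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_example_grads * scale
    # 2) sum the clipped gradients and add Gaussian noise
    noisy_sum = clipped.sum(axis=0) + np.random.normal(
        0.0, noise_multiplier * clip_norm, size=per_example_grads.shape[1])
    # 3) average and return the parameter update
    return -lr * noisy_sum / batch_size

if __name__ == "__main__":
    grads = np.random.randn(32, 10)       # fake per-example gradients
    print(dp_sgd_update(grads).shape)     # -> (10,)
```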
  • Sirokov, Roman (2014)
    Processing data produced by next-generation sequencing technologies is a computationally intensive task. We aim to speed up this task by means of parallel computing. Our parallel computing solution employs Slurm for managing the workload between different nodes. It can be used on top of Anduril, a workflow management software for scientific data analysis, as well as on its own. To test the performance of our solution, we use a workflow for post-processing and analyzing RNA-Seq data that originates from lymphoma patients. The data consist of 447 mutually independent samples that can be processed in parallel. To evaluate the performance we employ three different metrics: the level of parallelization, execution time and CPU load. The workflow achieved an excellent level of parallelization for the provided data of 447 samples, with an upper bound of 894 cores. Execution times were compared in two different ways: with sets of homogeneous samples of various sizes and with heterogeneous samples. Homogeneous sets took on average a similar amount of time regardless of the size of the set. With heterogeneous sets, the execution time of the largest sample was chosen as a reference, and the total execution time was 32% longer than this baseline. Finally, the CPU load of each component was measured. With homogeneous sets a high CPU load was observed, while with heterogeneous sets CPU idling was detected.
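    One common way to run such embarrassingly parallel per-sample work on Slurm is a job array, sketched below in Python. This is a generic illustration, not the thesis's Anduril integration; the helper script and file names (samples.txt, process_sample.sh) are hypothetical.

```python
# Illustrative sketch: submit one Slurm array task per sample, indexed by
# SLURM_ARRAY_TASK_ID. File names and resource requests are hypothetical.
import subprocess

NUM_SAMPLES = 447  # one independent RNA-Seq sample per array task

JOB_SCRIPT = """#!/bin/bash
#SBATCH --job-name=rnaseq-sample
#SBATCH --cpus-per-task=2
#SBATCH --time=04:00:00
# pick the sample on line (task id + 1) of samples.txt and process it
SAMPLE=$(sed -n "$((SLURM_ARRAY_TASK_ID + 1))p" samples.txt)
./process_sample.sh "$SAMPLE"
"""

def submit_array() -> None:
    with open("rnaseq_array.sh", "w") as f:
        f.write(JOB_SCRIPT)
    # array indices 0..446 give 447 tasks that Slurm can schedule in parallel
    subprocess.run(["sbatch", f"--array=0-{NUM_SAMPLES - 1}", "rnaseq_array.sh"],
                   check=True)

if __name__ == "__main__":
    submit_array()
```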
  • Husiev, Yurii (2019)
    The aim of the research was to explore new possibilities for indole activation via photoredox catalysis. The interest was focused in general on the synthesis of related biaryls through C(sp2)-C(sp2) radical-mediated cross-coupling reactions. The literature overview conducted revealed that photochemical methods are undergoing rapid development and are highly promising tools on the way towards our goals. As a result, it was discovered that 3-bromoindoles may interact with acridinium- and iridium-based photocatalysts, producing free radicals that can be coupled with the aryl of interest. Further method development helped to optimize the reaction conditions to achieve good to excellent biaryl yields. In addition, new ground-up synthetic routes towards several indoles, their derivatives and one of the Fukuzumi catalysts were disclosed and supplemented with spectral data. The obtained results might also be useful for developing more complicated dual catalysis systems.
  • Valdes Portas, Patricia (2024)
    The Fukushima-Daiichi Nuclear Power Plant (FDNPP) accident on March 11, 2011, resulted in the release of radioactive cesium-rich microparticles (CsMPs), which can travel long distances thanks to their small size and light weight. Since the long-term radiobiological health effects and accumulated radiation dose of inhaled CsMPs remain unknown, this thesis proposes a model for tracking synthetic 44-μm and 2.2-μm borosilicate microparticles, which resemble the SiO2 composition and spherical morphology of CsMPs, under positron emission tomography (PET) by radiolabeling them with positron-emitting radionuclides. The use of 44-μm microparticles was discontinued early on, as the size of the 2.2-μm particles was more representative of the more common type A CsMPs (0.1-10 μm). Three different radiolabeling approaches were pursued during this project: two directed at 68Ga-labeling, and a third at 18F-fluorination. The first and main approach was based on the surface functionalization of the particles with (3-aminopropyl)triethoxysilane (APTES) and a suitable chelator for the coordination of [68Ga]Ga3+ ions, like 2,2′-(7-(1-carboxy-4-((4-isothiocyanatobenzyl)-amino)-4-oxobutyl)-1,4,7-triazonane-1,4-diyl)diacetic acid (p-NCS-Bn-NODAGA) or desferrioxamine (DFO). The second approach involved surface functionalization with ethanolamine, polyethylene glycol (PEG) and DFO. The third approach was based on the natural ability of [18F]F- to substitute silanol groups present on the surface of the borosilicates. Surface functionalization with APTES was confirmed using X-ray photoelectron spectroscopy (XPS), zeta potential and elemental analysis, as opposed to functionalization with PEG-ethanolamine. Scanning electron microscopy (SEM) images showed no significant morphological alterations upon functionalization. 68Ga-labeling of the NODAGA-APTES functionalized 2.2-μm borosilicates was achieved with a mean radiochemical yield (RCY) and radiochemical purity (RCP) of 65 ± 5% and 94 ± 2%, respectively. 68Ga-labeling of DFO-APTES and DFO-PEG-ethanolamine functionalized 2.2-μm borosilicates was not successful (RCY below 15% and RCP of about 50%). 18F-fluorination was not successful due to the high tendency of [18F]F-Si bonds to undergo hydrolysis in aqueous media. The stability of the final [68Ga]Ga-NODAGA-APTES product over a 0-3 hour time period was higher than 90% in five different simulated physiological conditions. The results of this project serve as a promising prospect for the design of radiotracers resembling CsMPs for PET tracking upon in vivo administration.
  • Moroz, Anton (2022)
    The software development industry has been revolutionized through the adoption of software development methods such as DevOps. While adopting DevOps can speed up development through a collaborative culture between development and operations teams, speed-driven adoption can have an adverse impact on security aspects. DevSecOps is a concept that focuses on embedding security culture and activities into DevOps. Another contributing factor to the more agile development landscape is the widespread adoption of open source components. However, the risk of putting too much trust into the open source ecosystem has resulted in a whole new set of security issues that have not yet been adequately addressed by the industry. This thesis is commissioned by Neste Corporation. The company has set an initiative to incorporate methods that enable better transparency, agility, and security into their software development projects. This thesis collects research data on secure software development practices by combining the findings of a literature review with a case study. The qualitative case study is done by interviewing eight stakeholders from four different software development teams. The literature review shows that securing software is very much an ongoing effort, especially in the open source ecosystem. It might therefore not be surprising that the results of the case study revealed multiple shortcomings on the subject matter despite obvious efforts from the participating teams. As a result, this thesis presents potential ideas for the case company to consider integrating into their software development projects in order to kickstart their secure software development journey.
  • Righi, Cecilia (2024)
    The importance of investigating the mechanisms and processes related to aerosol particles, including those governing their formation, is nowadays recognized, with condensable vapors acknowledged as key precursors. Atmospheric pressure chemical ionization mass spectrometry has demonstrated exceptional capabilities for in-situ quantitative measurements of these vapors; hence, this analytical technique has been extensively applied for this purpose. Consequently, there is a growing need for measurement guidelines to ensure the comparability of data across different studies. The study outlined in this thesis was aimed at contributing to the establishment of best practices for calibrating atmospheric pressure chemical ionization mass spectrometers for the measurement of gaseous sulfuric acid and for evaluating the relative detection limit of the instrument. These objectives were pursued through systematic work on the calibration setup and procedure and on the assessment of the background signal of the system, which is needed to compute the associated detection limit.
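    For context, a widely used convention (an assumption here, not necessarily the exact definition adopted in the thesis) estimates the detection limit from the variability of the background signal, converted to a concentration with the instrument's calibration coefficient C:

```latex
% Common 3-sigma convention (assumption, not the thesis's stated definition):
% detection limit in concentration units, from the standard deviation of the
% background signal sigma_bg and the calibration coefficient C.
\mathrm{LOD} \;\approx\; C \cdot 3\,\sigma_{\mathrm{bg}}
```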
  • Viitaja, Tuomo (2019)
    Concerns about the state of the environment and global climate change have created a need for more efficient and greener ways to produce chemicals and fuels. One solution to these challenges is to find improved ways of utilizing biomass. This thesis deals with the valorization of pectin-rich biomasses. Bioengineering yeast to express the alternative galacturonic acid catabolism pathway opens up an opportunity to use these underutilized biomasses in a more efficient way. In order to bioengineer yeast, understanding of the metabolic pathways and the enzymes functioning on these pathways is required. In order to map out certain degradation steps, access to non-commercial compounds and structural analogues is a necessity. The aim of this thesis was to develop a synthetic route to D-tagaturonate and its structural analogues, which are intermediates on the galacturonic acid catabolism pathway. The chosen multistep synthetic route to D-tagaturonate proved to be challenging. The end product was not obtained; however, the laboratory work showed that the synthetic route is feasible after some minor adjustments. On a general level, new information on the limitations of widely utilized protective groups could be uncovered. These findings will help to optimize the synthetic route to D-tagaturonate. In addition, these findings show that there is still room for improvement in orthogonal protective group strategies applicable to the synthesis of complex organic molecules such as carbohydrates.
  • Kuusisto, Teemu (2015)
    The objective of this thesis was text-based prediction of phrasal prominence. Improving natural-sounding speech synthesis motivated the task, because phrasal prominence, which depicts the relative saliency of words within a phrase, is a natural part of spoken language. Following the majority of previous research, prominence is predicted on a binary level derived from a symbolic representation of pitch movements. In practice, new classifiers and new models from different fields of natural language processing were explored. The applicability of spatial and graph-based language models was tested by proposing features such as word vectors, a high-dimensional vector-space representation, and DegExt, a keyword weighting method. Support vector machines (SVMs) were used due to their widespread suitability for supervised classification tasks with high-dimensional continuous-valued input. A linear inner product and a non-linear radial basis function (RBF) were used as kernels. Furthermore, hidden Markov support vector machines (HM-SVMs) were evaluated to investigate the benefits of sequential classification. The experiments on the widely used Boston University Radio News Corpus (BURNC) were successful in two major ways: firstly, the non-linear support vector machine along with the best-performing features achieved similar performance to the previous state-of-the-art approach reported by Rangarajan et al. [RNB06]. Secondly, newly proposed features based on word vectors moderately outperformed part-of-speech tags, which had consistently been the best-performing feature throughout the research on text-based prominence prediction.
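    A minimal scikit-learn sketch of the classifier family used in the thesis, an RBF-kernel SVM over word-level feature vectors, is given below; the data are synthetic placeholders, not BURNC features or the thesis's actual setup.

```python
# Illustrative sketch: RBF-kernel SVM for binary prominence prediction from
# word-level feature vectors. The features and labels here are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))                   # e.g. word vectors + context features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # fake binary prominence labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale")    # non-linear kernel, as in the thesis
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```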
  • Stubb, Henrik (2021)
    Boron neutron capture therapy (BNCT) is an emerging cancer treatment method that is currently limited by suboptimal boron delivery strategies. A wide range of biomolecules have been investigated as potential tumor-targeting boron carriers. Carbohydrates excel in several respects by providing high solubility and low cytotoxicity. This work focuses on the synthesis of an orthogonally protected glycoconjugate for a GLUT1-targeting approach to BNCT. Knowledge of suitable glycoconjugate structures for this targeting strategy is emerging, and the aim of this work was to explore further functionalization of these structures, which will be necessary for future labeling of the boron carriers. The seven-step synthesis of an orthogonally protected mannopyranoside – and its structural determination with NMR spectroscopy and mass spectrometry – is described in detail. The attachment of a boron cluster to the target molecule was performed, and early-stage fluorination trials were carried out. While important insights were obtained from these functionalization attempts, the functionalization of the target molecule will require some additional effort in the future. The protecting group strategies are currently being redesigned based on the information obtained through this work.
  • Zubair, Maria (2022)
    The growing popularity of the Internet of Things (IoT) has massively increased the volume of data available for analysis. This data can be used to get detailed and precise insights about users, products, and organizations. Traditionally, organizations collect and process this data separately, which is a slow process and requires significant resources. Over the past decade, data sharing has become a popular trend, where several organizations have engaged in sharing their collected data with other organizations and processing it together for analysis. Digital marketplaces are developed to facilitate this data sharing. These marketplaces connect producers and consumers of data while ensuring that the data can be shared inside and outside the organization seamlessly and securely. This is achieved by implementing a fine-grained and efficient data access control method that restricts access to the data to authorized parties only. The data generated by IoT devices is voluminous, continuous, and heterogeneous. Therefore, traditional access control methods are no longer suitable for managing access to this data in a digital marketplace. IoT data requires an access control model that can handle large volumes of streaming data and provides IoT device owners with full control over and transparency of data access. In this thesis, we have designed and implemented a novel access control mechanism for a data distribution system developed by Nokia Bell Labs. We have outlined the requirements for designing an access control system to manage access to data shared across multiple heterogeneous organizations. We have evaluated the proposed system to assess its feasibility and performance in various scenarios. The thesis also discusses the strengths and limitations of the proposed system and highlights future research perspectives in this domain. We expect this thesis to be helpful for researchers studying IoT data processing, access control methods for streaming (big) data, and digital marketplaces.
  • Hyvönen, Juuso (2015)
    Software companies have problems connecting business goals to actual software development tasks. This means that a lot of software development is done without linkage to business goals, which can lead to wasted time and money, bloated and hard-to-maintain software, and failed software projects. Roadmapping is a popular method for communicating future product development efforts, but a problem with roadmaps is that they usually do not communicate the value the roadmapped future work is expected to create. This thesis presents an action-oriented case study on three software planning techniques that fit into a lean software product development organization. The case study concerns the benefits of using Lean canvas, Impact mapping and Lego serious play as tools for value-oriented product development planning. Lean canvas and Impact mapping are promising techniques for helping a company connect business goals to actual software development tasks. They therefore aid in avoiding waste and creating value by putting effort only into the development tasks that create value. Lego serious play is a strategic planning method that utilizes Lego bricks and the knowledge of the whole development team to improve decision making and build shared knowledge. Results from the study show that Lean canvas fulfilled its promise of being a lightweight technique that improves shared understanding of the business model. Lego serious play has a similar benefit of improving shared understanding in the team, but it is more geared towards visualizing problems and finding solutions to them. Impact mapping proved to be an effective way to find value-creating tasks and to visualize the value assumptions behind each task. Based on these findings, an approach for value-oriented roadmapping is sketched.
  • Wahlman, Pyry (2012)
    In this thesis we give a self-contained introduction to the trace anomaly and its applications to the problem of the cosmological constant. We begin by revising the renormalization of quantum electrodynamics in flat space and the Lagrangian formalism of general relativity. Then we briefly discuss the renormalizability of quantum general relativity, after which we turn our attention to a semiclassical theory of quantum gravitation. We review the construction and renormalization of the semiclassical theory, and briefly discuss its stability. We then proceed to examine the trace anomaly of the semiclassical theory, beginning with a review of Weyl cohomology in n dimensions. We use the Weyl cohomology to construct the Wess-Zumino action, from which we derive a non-local action for the trace anomaly. The non-local action is then rendered local by introducing new auxiliary fields, in which the non-local behaviour of the action is contained. After these theoretical considerations we finally examine the non-trivial cosmological consequences of the trace anomaly. We first briefly review the Friedmann-Robertson-Walker model and its classical perturbations, after which we examine the linear perturbations of the trace anomaly action in de Sitter space. We find that when the auxiliary fields of the action are quantized, the cosmological constant becomes dependent on the boundary conditions at the horizon scale of de Sitter space. We then conclude that the small but non-zero value of the cosmological constant could be a physical consequence of the presence of the horizon.
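    For reference, the four-dimensional trace anomaly discussed above is commonly written in the form below; conventions and coefficient normalizations vary between references, so this should be read as a generic statement rather than the thesis's exact expression.

```latex
% Generic form of the 4D trace anomaly (one common convention; the coefficients
% b, b', b'' depend on the field content and the chosen normalization).
\langle T^{\mu}{}_{\mu} \rangle
  \;=\; b\,F \;+\; b'\!\left(E - \tfrac{2}{3}\Box R\right) \;+\; b''\,\Box R ,
\qquad
F = C_{\mu\nu\rho\sigma}C^{\mu\nu\rho\sigma},
\qquad
E = R_{\mu\nu\rho\sigma}R^{\mu\nu\rho\sigma} - 4R_{\mu\nu}R^{\mu\nu} + R^{2}.
```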