Browsing by Title

  • Lindgren, Eveliina (2015)
    An experiment-driven approach to software product and service development is getting increasing attention as a way to channel limited resources to the efficient creation of customer value. In this approach, software functionalities are developed incrementally and validated in continuous experiments with stakeholders such as customers and users. The experiments provide factual feedback for guiding subsequent development. Although case studies on experimentation conventions in the industry exist, an understanding of the state of the practice is incomplete. Furthermore, the obstacles and success factors of continuous experimentation have been little discussed. To these ends, an interview-based qualitative survey was conducted, exploring the experimentation experiences of ten software development companies. The study found that although the principles of continuous experimentation resonated with industry practitioners, the state of the practice was not mature. In particular, experimentation was rarely systematic and continuous. Key challenges related to changing organizational culture, accelerating development cycle speed, measuring customer value and product success, and securing resources. Success factors included an empowered organizational culture and deep customer and domain knowledge. There was also a good availability of technical tools and competence to support experimentation.
  • Ahlfors, Dennis (2022)
    While the role of IT and computer science in society is growing, interest in computer science education is also on the rise. Research covering study success and study paths is important both for understanding student needs and for developing the educational programmes further. Using a data set covering student records from 2010 to 2020, this thesis aims to provide key insights into, and a basis for further research on, computer science study success and study paths at the University of Helsinki. Using novel visualizations and descriptive statistics, this thesis builds a picture of the evolution of study paths and student success during a 10-year timeframe, providing much-needed contextual information to be used as inspiration for future focused research into the phenomena discovered. The visualizations combined with statistical results show that certain student groups seem to have better study success and that there are differences in the study paths chosen by the student groups. It is also shown that graduation rates from the Bachelor’s Programme in Computer Science are generally low, with some student groups showing higher-than-average graduation rates. Time from admission to graduation is longer than the suggested duration, and the sample study paths provided by the university are not generally followed, leading to the conclusion that the programme structure would need some reassessment to better accommodate students with diverse academic backgrounds and differing personal study plans.
  • Barral, Oswald (2013)
    This thesis goes further in the study of implicit indicators used to infer interest in documents for information retrieval tasks. We study the behavior of two different categories of implicit indicators: fixation-derived features (number of fixations, average time of fixations, regression ratio, length of forward saccades) and physiology (pupil dilation, electrodermal activity). With the limited number of participants at our disposal, we study how these measures react when documents are read at three different reading rates. Most of the fixation-derived features are reported to differ significantly when reading at different speeds. Furthermore, the ability of pupil size and electrodermal activity to indicate perceived relevance is found to be intrinsically dependent on reading speed. That is, when users read at a comfortable reading speed, these measures are able to correctly discriminate relevance judgments, but they fail when the reading speed is increased. Therefore, the outcomes of this thesis strongly suggest taking reading speed into account when designing highly adaptive information retrieval systems.
  • Henderson, Gillian (2024)
    Mass urbanization has led to rising global energy usage, greenhouse gas emissions, and building energy inefficiency. Helsinki has set a target of achieving carbon neutrality by 2030, making it essential to modernize the old building infrastructure. This thesis focuses on the challenges of implementing energy-efficiency renovations (EERs) in housing associations in Helsinki, emphasizing the critical role of government intervention and stakeholder involvement. The study employs a comprehensive mixed-methods approach, utilizing an Integrated Meta-Theoretical Framework to analyze the techno-economic, socio-technical, and political aspects affecting EERs. The research uses qualitative and quantitative data collection methods, including surveys and expert interviews, to gather diverse perspectives on the challenges and solutions of EERs. This systematic approach aims to provide a balanced understanding of the interconnected factors influencing EERs, enabling effective interventions to support Helsinki’s climate objectives. The study identifies significant obstacles such as high initial costs, uncertain Return on Investment (ROI), technical limitations, resident resistance, lack of awareness, and inadequate government incentives. Statistical techniques and data analysis were used to quantify the impact of these barriers, and thematic analysis was used to interpret qualitative responses, providing a detailed view of the underlying issues. Recommendations include enhancing governmental support with robust financial incentives, implementing regulatory reforms to simplify renovation processes, and targeting educational programs to change homeowner perceptions toward the long-term benefits of EERs. These findings and recommendations can potentially advance urban sustainability and environmental policy, providing actionable strategies to accelerate the adoption of EERs in Helsinki and similar urban settings, thereby advancing broader sustainability and climate goals.
  • Krause, Tina (2023)
    The use of fossil fuels is a significant contributor to greenhouse gas emissions, making the transition to zero-carbon energy systems and improvements in energy efficiency important in climate change mitigation. The energy transition also puts the citizen in a more influential position and changes the traditional dynamics between energy producers and consumers when citizens produce their own energy. Furthermore, due to the opportunity and capacity of the demand side, the energy transition places a greater emphasis on energy planning at the local level, which requires new solutions and changes in the management of the energy system. One rising phenomenon is the potential of bottom-up developments like energy communities. Within the building sector, housing cooperatives have started to emerge as energy communities, offering a way to address the energy transition through bottom-up-driven energy actions that provide renewable energy and other benefits to the local community. This master's thesis examines housing cooperatives' energy communities and their role in the energy transition. The research addresses the shared renovation project in Hepokulta, Turku, seen from the energy community perspective. Furthermore, the research highlights the importance of niche development in sustainable transition, acknowledging energy communities as socio-technical niches where development is partly embedded in renewable energy technology and partly in new practices. The concept of energy community is therefore analysed through the lens of Strategic Niche Management, which focuses on expectations, networks, and learning. This research aims to analyse how residents in Hepokulta perceive the energy community project through the niche processes and how the development of energy communities might affect urban development. Analysing the residents' perceptions provides insight into the energy community characteristics and the relationship between residents and the project development processes. Additionally, the analysis identifies matters that could be changed to improve the development. The thesis is a mixed-methods study combining quantitative and qualitative data, which was collected through a survey sent to the eight housing cooperatives in Hepokulta. The research showed that residents perceive the shared project in Hepokulta as essential for the area's development. Moreover, many residents overlooked the social aspects of the development, highlighting the absence of the energy community perspective in the renovation. The findings suggest some weaknesses within the three niche processes, including the early involvement of residents and communication. Furthermore, although the residents perceived themselves as important actors and the literature emphasised the importance of the demand side in future energy systems, the research revealed that the connection between project development and the residents is still lacking. However, the analysis indicates that introducing additional actors could help the energy community develop. External assistance could, for instance, benefit the housing cooperatives by facilitating improvements in the decision-making processes, the network between actors, and the sharing of information and skills.
  • Wachirapong, Fahsinee (2023)
    The importance of topic modeling in the analysis of extensive textual data is magnified by the inefficiency of manual work due to its time-consuming nature. Data preprocessing is a critical step before feeding text data to analysis. This process ensures that irrelevant information is removed and the remaining text is suitably formatted for topic modeling. However, the absence of standard rules often leads practitioners to adopt undisclosed or poorly understood preprocessing strategies. This potentially impacts the reproducibility and comparability of research findings. This thesis examines text preprocessing, including lowercase conversion, non-alphabetic removal, stopword elimination, stemming, and lemmatization, and explores their influence on data quality, vocabulary size, and the interpretation of topics generated by the topic model. Additionally, it examines variations in the text preprocessing sequence and their impact on the topic model's outcomes. Our examination spans 120 diverse preprocessing approaches on the Manifesto Project Dataset. The results underscore the substantial impact of preprocessing strategies on perplexity scores and highlight the challenges in determining the optimal number of topics and interpreting the final results. Importantly, our study raises awareness of the role of data preprocessing in shaping the perceived themes and content in identified topics and proposes recommendations for researchers to consider before performing data preprocessing.
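    A minimal sketch of the kind of preprocessing pipeline listed above (lowercasing, non-alphabetic removal, stopword elimination, stemming or lemmatization), assuming Python with NLTK; the ordering and parameters are illustrative only, not the 120 specific configurations compared in the thesis.

      import re
      from nltk.corpus import stopwords
      from nltk.stem import PorterStemmer, WordNetLemmatizer

      # Requires the usual NLTK data downloads (stopwords, wordnet).
      STOPWORDS = set(stopwords.words("english"))
      stemmer = PorterStemmer()
      lemmatizer = WordNetLemmatizer()

      def preprocess(text, use_stemming=False):
          """One possible ordering: lowercase -> drop non-alphabetic characters
          -> tokenize -> remove stopwords -> stem or lemmatize."""
          text = text.lower()                                        # lowercase conversion
          text = re.sub(r"[^a-z\s]", " ", text)                      # non-alphabetic removal
          tokens = [t for t in text.split() if t not in STOPWORDS]   # stopword elimination
          if use_stemming:
              return [stemmer.stem(t) for t in tokens]               # stemming
          return [lemmatizer.lemmatize(t) for t in tokens]           # lemmatization

      # Example: preprocess("Parties' manifestos differ in many interesting ways.")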
  • Nurminen, Marisa (2024)
    Highly skilled migrants are an exception in otherwise tightening immigration policies in Europe, as most countries welcome and compete for them due to their anticipated effects on national competitiveness and economic growth. Meanwhile, the effects of a shrinking and ageing population underline the need for immigration. However, policies to retain highly skilled migrants seem to lack concrete measures, and research regarding the integration of highly skilled migrants seems to focus on their labor market and workplace integration as a workforce and economic advantage, rather than on their integration as individuals who also need to feel a sense of belonging to other parts of society. Therefore, the objective of this master’s thesis is to increase the understanding of highly skilled migrants’ perceptions and experiences of the process of social integration, as well as the factors influencing them. The theoretical and conceptual framework of this thesis is constructed around two key concepts: highly skilled migrant refers to a migrant with a tertiary education degree and specific skills, while the process of social integration is defined in this thesis as a dimension of the process of integration, which refers to the sense of belonging of the migrant, as well as to the role of society in accepting the migrant. The regional context of this thesis is the Helsinki metropolitan area in Finland due to its large number of foreigners and the importance of the information and communication technology sector in the area. In order to achieve the objective of this thesis, four highly skilled migrants working in the ICT sector and living in the Helsinki metropolitan area were interviewed and requested to write diaries. The gathered research material was then analyzed by applying thematic analysis. This thesis indicates that highly skilled migrants perceive social integration as a subjective process, which they experience mainly as their own responsibility. The process is, however, influenced by the attitudes, expectations and atmosphere of different levels of society. The experiences of highly skilled migrants indicate that they are welcomed especially because of their social status as highly skilled, as the negative experiences seemed to be mainly related to their foreignness. The sense of belonging was also observed to be influenced positively by holding the citizenship of a European country. Connections with Finnish people and the cultural context in particular have a strong impact on the sense of belonging to society, and studies in Finland prior to working were perceived as having a positive impact on it. On the other hand, creating connections and relationships with Finns was perceived as difficult, especially due to challenges in getting to know Finns and in learning the Finnish language. Although this thesis indicates that the workplace can play a significant role in the process of social integration, work is perceived as only one part of life, and therefore, further examination of other dimensions of integration regarding highly skilled migrants is needed.
  • Häkkinen, Jenni (2024)
    Gravitational waves from cosmological phase transitions are a promising probe of the early universe. Many theories beyond the Standard Model predict the early universe to have undergone a cosmological first-order phase transition at the electroweak scale. This transition would have produced gravitational waves potentially detectable with the future space-based detector Laser Interferometer Space Antenna (LISA). We study the gravitational wave power spectrum generated by sound waves, which are a dominant source of gravitational waves from first-order phase transitions. We compare two methods for calculating the sound wave power spectrum: a simulation-motivated broken power-law fit of the shape of the spectrum, and a wider theoretical framework called the Sound Shell Model, which includes hydrodynamic calculations of the phase transition. We present an implementation of the Sound Shell Model into the PTPlot tool, which is currently based on the broken power-law fit. With PTPlot, we calculate the signal-to-noise ratios of LISA for the sound wave power spectrum of each method. The signal-to-noise ratio allows us to estimate the detectability of gravitational wave signals with LISA. We analyse how the detectability of certain particle physics models changes between the two different methods. Our results show that the Sound Shell Model has a potentially significant impact on the signal-to-noise ratio predictions, but it does not uniformly improve or worsen the detectability of the gravitational wave signals compared to the broken power law. The code implementation is overall successful and lays the foundation for an updated release of PTPlot and future work within this topic.
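    For context, the signal-to-noise ratio mentioned above is conventionally defined, in the LISA phase-transition literature that PTPlot builds on, as an integral of the predicted spectrum against the detector sensitivity over the observation time; the display below uses the standard notation of that literature rather than symbols taken from the thesis itself.

      % Standard SNR definition for a stochastic background, assuming an
      % observation time T_obs and a sensitivity expressed as \Omega_sens(f):
      \[
        \mathrm{SNR} \;=\; \sqrt{\, T_{\mathrm{obs}} \int_{f_{\min}}^{f_{\max}}
          \left( \frac{\Omega_{\mathrm{GW}}(f)}{\Omega_{\mathrm{sens}}(f)} \right)^{2} \mathrm{d}f \,}
      \]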
  • Heinonen, Ava (2020)
    The design of instructional material affects learning from it. Abstraction, or limiting details and presenting difficult concepts by linking them with familiar objects, can limit the burden on working memory and make learning easier. The presence of visualizations and the level to which students can interact with them and modify them, also referred to as engagement, can promote information processing. This thesis presents the results of a study using a 2x3 experimental design with abstraction level (high abstraction, low abstraction) and engagement level (no viewing, viewing, presenting) as the factors. The study consisted of two experiments with different topics: hash tables and multidimensional arrays. We analyzed the effect of these factors on instructional efficiency and learning gain, accounting for prior knowledge and prior cognitive load. We observed that high abstraction conditions limited cognitive load during studying for all participants, but were particularly beneficial for participants with some prior knowledge of the topic they studied. We also observed that higher engagement levels benefit participants with no prior knowledge of the topic they studied, but not necessarily participants with some prior knowledge. Low cognitive load in the pre-test phase makes studying easier regardless of the instructional material, as does knowledge of the topic being studied. Our results indicate that abstractions and engagement with learning materials need to be designed with the students and their knowledge levels in mind. However, further research is needed to assess the components in different abstraction levels that affect learning outcomes, and why and how cognitive load in the pre-test phase affects cognitive load throughout studying and testing.
  • Korhonen, Keijo (2022)
    The variational quantum eigensolver (VQE) is one of the most promising proposals for a hybrid quantum-classical algorithm made to take advantage of near-term quantum computers. With the VQE it is possible to find ground state properties of various molecules, a task for which many classical algorithms have been developed, but these become either too inaccurate or too resource-intensive, especially for so-called strongly correlated problems. The advantage of the VQE comes from the ability of a quantum computer to represent a complex system with fewer so-called qubits than a classical computer would with bits, thus making the simulation of large molecules possible. One of the major bottlenecks for the VQE to become viable for simulating large molecules, however, is the scaling of the number of measurements necessary to estimate expectation values of operators. Numerous solutions have been proposed, including the use of adaptive informationally complete positive operator-valued measures (IC-POVMs) by García-Pérez et al. (2021). Adaptive IC-POVMs have been shown to improve the precision of estimations of expectation values on quantum computers with better scaling in the number of measurements compared to existing methods. The use of these adaptive IC-POVMs in a VQE allows for more precise energy estimations and additional expectation value estimations of separate operators without any further overhead on the quantum computer. We show that this approach improves upon existing measurement schemes and adds a layer of flexibility, as IC-POVMs represent a form of generalized measurements. In addition to a naive implementation of using IC-POVMs as part of the energy estimations in the VQE, we propose techniques to reduce the number of measurements by adapting the number of measurements necessary for a given energy estimation or through the estimation of the operator variance for a Hamiltonian. We present results for simulations using the former technique, showing that we are able to reduce the number of measurements while retaining the improvement in the measurement precision obtained from IC-POVMs.
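    As a rough illustration of the measurement-scaling bottleneck described above, the statistical error of an expectation-value estimate shrinks only as the inverse square root of the number of measurement shots. The toy Monte Carlo below (plain NumPy, a single two-outcome ±1 observable; not the IC-POVM scheme of the thesis) shows that behaviour.

      import numpy as np

      rng = np.random.default_rng(0)
      p_plus = 0.7                     # probability of measuring eigenvalue +1
      exact = 2 * p_plus - 1           # exact expectation value of the observable

      for shots in (100, 1_000, 10_000, 100_000):
          outcomes = rng.choice([+1, -1], size=shots, p=[p_plus, 1 - p_plus])
          estimate = outcomes.mean()
          # The shot-noise error decreases roughly as 1/sqrt(shots), which is
          # why estimating many operators to high precision is expensive.
          print(f"{shots:>7} shots: estimate = {estimate:+.4f}, "
                f"error = {abs(estimate - exact):.4f}")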
  • Silvennoinen, Meeri (2022)
    Malaria is a major cause of human mortality, morbidity, and economic loss. P. falciparum is one of six Plasmodium species that cause malaria and is widespread in sub-Saharan Africa. Many of the currently used drugs for malaria have become less effective, have adverse effects, and are highly expensive, so new ones are needed. mPPases are membrane-integral pyrophosphatases that are found in the vacuolar membranes of protozoa but not in humans. These enzymes pump sodium ions and/or protons across the membrane and are crucial for parasite survival and proliferation. This makes them promising targets for new drug development. In this study we aimed to identify and characterize transient pockets in mPPases that could offer suitable ligand binding sites. P. falciparum was chosen because of its therapeutic interest, and T. maritima and V. radiata were chosen because they are test systems in compound discovery. The research was performed using molecular modelling techniques, mainly homology modelling, molecular dynamics, and docking. mPPases from three species were used to make five different systems: P. falciparum (apo closed conformation), T. maritima (apo open, open with ligand, and apo closed) and V. radiata (open with ligand). P. falciparum mPPase does not have a 3D structure available, so a homology model was built using the closest structure available, from V. radiata mPPase, as a template. Runs of 100 ns molecular dynamics simulations were conducted for these five systems: monomeric mPPase from P. falciparum and dimeric mPPases for the others. Two representative 3D structures from each of the five trajectories, chosen by clustering to be the most dissimilar from one another, were selected for further analysis. The selected 3D structures were first analyzed to identify possible binding pockets using two independent methods, SiteMap and blind docking (where no pre-determined cavity is set for docking). A second set of experiments using different scores (druggability, enclosure, exposure, …) and targeted docking were then run to characterize all the located pockets. As a result, only half of the catalytic pockets were identified. None of the transient pockets were identified in P. falciparum mPPase, and all of them were located within the membrane. Docking was performed using compounds that have shown inhibitory behavior in previous studies but did not give good results in the tested structures. In the end, none of the transient pockets were interesting for further study.
  • Koskinen, Anssi (2020)
    The applied mathematical field of inverse problems studies how to recover an unknown function from a set of possibly incomplete and noisy observations. One example of a real-life inverse problem is image destriping, which is the process of removing stripes from images. Stripe noise is a very common phenomenon in various fields, such as satellite remote sensing and dental x-ray imaging. In this thesis we study methods to remove stripe noise from dental x-ray images. The stripes in the images are a consequence of the geometry of our measurement and the sensor. In x-ray imaging, the x-rays are sent at a certain intensity through the measured object and the remaining intensity is then measured using the x-ray detector. The detectors used in this thesis convert the remaining x-rays directly into electrical signals, which are then measured and finally processed into an image. We observe that the measured values behave according to an exponential model and use this knowledge to turn the task into a nonlinear fitting problem. We study two linearization methods and three iterative methods. We examine the performance of the correction algorithms with both simulated and real stripe images. The results of the experiments show that although some of the fitting methods give better results in the least squares sense, the exponential prior leaves some visible line artefacts. This suggests that the methods can be further improved by applying a suitable regularization method. We believe that this study is a good baseline for a better correction method.
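    As a toy illustration of linearizing an exponential model of the kind mentioned above, taking logarithms reduces the fit to ordinary least squares; the sketch below assumes a model y = a·exp(b·x) with multiplicative noise and is not claimed to be either of the two linearization methods actually compared in the thesis.

      import numpy as np

      # Assumed model: y = a * exp(b * x), observed with multiplicative noise.
      rng = np.random.default_rng(1)
      x = np.linspace(0.0, 4.0, 50)
      y_obs = 2.5 * np.exp(-0.8 * x) * rng.lognormal(sigma=0.05, size=x.size)

      # Linearization: log(y) = log(a) + b * x, an ordinary least-squares problem.
      A = np.vstack([np.ones_like(x), x]).T
      log_a, b = np.linalg.lstsq(A, np.log(y_obs), rcond=None)[0]
      a = np.exp(log_a)
      print(f"fitted a = {a:.3f}, b = {b:.3f}")   # true values: a = 2.5, b = -0.8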
  • Merikoski, Jori (2016)
    We study growth estimates for the Riemann zeta function on the critical strip and their implications for the distribution of prime numbers. In particular, we use the growth estimates to prove the Hoheisel-Ingham Theorem, which gives an upper bound for the difference between consecutive prime numbers. We also investigate the distribution of prime pairs, in connection with which we offer original ideas. The Riemann zeta function is defined as ζ(s) := \sum_{n =1}^{∞} n^{-s} in the half-plane Re s > 1. We extend it to a meromorphic function on the whole plane with a simple pole at s=1, and show that it satisfies the functional equation. We discuss two methods, van der Corput's and Vinogradov's, to give upper bounds for the growth of the zeta function on the critical strip 0 ≤ Re s ≤ 1. Both of these are based on the observation that ζ(s) is well approximated on the critical strip by a finite exponential sum \sum_{n =1}^{T} n^{-s} = \sum_{n =1}^{T} exp\{ -s log n \}. Van der Corput's method uses the Poisson summation formula to transform this sum into a sum of integrals, which can be easily estimated. This yields the estimate ζ(1/2 + it) = \mathcal{O} (t^{\frac{1}{6}} log t), as t → ∞. Vinogradov's method transforms the problem of estimating an exponential sum into a combinatorial problem. It is needed in order to give a strong bound for the growth of the zeta function near the vertical line Re s = 1. We use complex analysis to prove the Hoheisel-Ingham Theorem, which states that if ζ(1/2 + it) = \mathcal{O} (t^{c}) for some constant c > 0, then for any θ > \frac{1+4c}{2+4c}, and for any function x^{θ} << h(x) << x, we have ψ (x+h) - ψ (x) ∼ h, as x → ∞. The proof of this relies heavily on the growth estimate obtained by Vinogradov's method. Here ψ(x) := \sum_{n ≤ x} Λ (n) = \sum_{p^k ≤ x} log p is the summatory function of the von Mangoldt function. From this we obtain, using van der Corput's estimate, that the difference between consecutive primes satisfies p_{n+1} - p_{n} < p_{n}^{\frac{5}{8} + \epsilon} for all large enough n, and for any \epsilon > 0. Finally, we study prime pairs and the Hardy-Littlewood Conjecture on their distribution. More precisely, let π _{2k}(x) stand for the number of prime numbers p ≤ x such that p+2k is also a prime. The following ideas are all original contributions of this thesis: We show that the average of π _{2k}(x) over 2k ≤ x^{θ} is exactly what is expected by the Hardy-Littlewood Conjecture. Here we can choose θ > \frac{1+4c}{2+4c} as above. We also give a lower bound of π _{2k}(x) for the averages over much smaller intervals 2k ≤ E log x, and give interpretations of our results using the concept of equidistribution. In addition, we study prime pairs by using the discrete Fourier transform. We express the function π _{2k}(n) as an exponential sum, and extract from this sum the term predicted by the Hardy-Littlewood Conjecture. This is interpreted as a discrete analog of the method of major and minor arcs, which is often used to tackle problems of additive number theory.
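    For readability, the central implication summarized above can be restated in display form (same notation as in the abstract):

      % Hoheisel-Ingham: a growth estimate for the zeta function on the critical
      % line yields an asymptotic for primes in short intervals, and hence a
      % bound on gaps between consecutive primes.
      \[
        \zeta(\tfrac{1}{2} + it) = \mathcal{O}(t^{c})
        \quad\Longrightarrow\quad
        \psi(x+h) - \psi(x) \sim h \ \ (x \to \infty)
        \quad\text{for all } \theta > \tfrac{1+4c}{2+4c},\ x^{\theta} \ll h(x) \ll x,
      \]
      \[
        \text{where } \psi(x) := \sum_{n \le x} \Lambda(n) = \sum_{p^{k} \le x} \log p,
        \qquad\text{and with van der Corput's } c = \tfrac{1}{6}:\qquad
        p_{n+1} - p_{n} < p_{n}^{\,5/8 + \epsilon} \ \text{for all large } n.
      \]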
  • Kaipio, Mikko Ari Ilmari (2014)
    This master's thesis consists of two parts related to atomic layer deposition (ALD) processes: a literature survey of so-called ex situ in vacuo analysis methods used in investigations of the ALD chemistry and a summary of the work performed by the author using in situ methods. The first part of the thesis is divided into four sections. In the first two sections ALD as a thin film deposition method is introduced, and in situ and ex situ in vacuo publications related to ALD are summarized. The third section is a general overview of ex situ in vacuo analysis methods, and the final section a literature review covering publications where ex situ in vacuo techniques have been employed in studying ALD processes, with a strong emphasis on analysis methods which are based on the use of x-rays. The second part of the thesis consists of in situ quartz crystal microbalance and quadrupole mass spectrometry studies of the V(NEtMe)4/D2O, V(NEtMe)4/O3, Mg(thd)2/TiF4 and Cu2(CH3COO)4/D2O ALD processes. The experimental apparatus and related theory are given a brief overview, followed by a presentation and discussion of the results.
  • Diseth, Anastasia Chabounina (2024)
    Combinatorial optimization problems arise in many applications. Finding solutions that are as good as possible, ideally optimal, with respect to given criteria is important. Additionally, many real-world combinatorial optimization problems are NP-hard. The so-called declarative approach to solving combinatorial optimization problems has proven to be successful in practice. In this work we focus on the implicit hitting set-based (IHS) maximum satisfiability (MaxSAT) paradigm for solving combinatorial optimization problems declaratively. In the MaxSAT paradigm the problem at hand is formulated as a linear objective function to minimize subject to a set of constraints expressed in the language of propositional logic. In the IHS approach the problem is solved by alternating calls to two subroutines: an optimizer procedure computes optimal solutions over the variables in the objective function without the constraints available, and a feasibility oracle verifies the solutions in terms of the constraints. In this work we study alternative divisions of the constraints of a given problem formulation between the optimizer and the oracle. We allow the optimizer to compute solutions over any variables of the problem instance, thus extending the hitting set formulations of IHS-based MaxSAT. We focus on two specific combinatorial optimization problems and existing MaxSAT encodings of these problems. The problems we focus on are computing the treewidth of a graph and finding an optimal k-undercover Boolean matrix factorization. We have also extended a state-of-the-art IHS-based MaxSAT solver to support extended divisions of encodings and provide the implementation as open source.
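    A toy sketch of the implicit hitting set loop described above, assuming a brute-force hitting-set optimizer and a brute-force feasibility oracle; the instance, the coarse core extraction and the function names are illustrative only, not the extended solver developed in the thesis.

      from itertools import combinations, product

      # Toy weighted instance: minimise the total weight of variables set to True,
      # subject to hard CNF clauses (+v means "v is True", -v means "v is False").
      HARD = [(1, 2), (2, 3), (1, 3)]
      WEIGHT = {1: 3, 2: 2, 3: 2}

      def oracle(forced_false):
          """Feasibility oracle: brute-force search for a model of HARD in which
          every variable in forced_false is False. Returns a model or None."""
          free = [v for v in WEIGHT if v not in forced_false]
          for bits in product([True, False], repeat=len(free)):
              model = dict(zip(free, bits), **{v: False for v in forced_false})
              if all(any(model[abs(l)] == (l > 0) for l in c) for c in HARD):
                  return model
          return None

      def optimizer(cores):
          """Optimizer: cheapest set of variables to 'pay for' (allow to be True)
          that intersects every core found so far. Brute force here; an integer
          programming solver typically plays this role in real IHS solvers."""
          best, best_cost = set(), None
          for r in range(len(WEIGHT) + 1):
              for subset in combinations(WEIGHT, r):
                  cost = sum(WEIGHT[v] for v in subset)
                  if all(set(subset) & set(core) for core in cores):
                      if best_cost is None or cost < best_cost:
                          best, best_cost = set(subset), cost
          return best

      def ihs():
          cores = []
          while True:
              paid = optimizer(cores)                # optimizer call
              model = oracle(set(WEIGHT) - paid)     # oracle call
              if model is not None:
                  return model, sum(WEIGHT[v] for v in WEIGHT if model[v])
              # Infeasible: at least one of the unpaid variables must be True,
              # so the set of unpaid variables is a (coarse) core.
              cores.append(sorted(set(WEIGHT) - paid))

      print(ihs())   # expected optimum: x2 and x3 True, total cost 4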
  • Rissanen, Olli (2014)
    Delivering more value to the customer is the goal of every software company. In modern software business, delivering value in real-time requires a company to utilize real-time deployment of software, data-driven decisions and empirical evaluation of new products and features. These practices shorten the feedback loop and allow for faster reaction times, ensuring the development is focused on features providing real value. This thesis investigates practices known as continuous delivery and continuous experimentation as means of providing value for the customers in real-time. Continuous delivery is a development practice where the software functionality is deployed continuously to the customer environment. This process includes automated builds, automated testing and automated deployment. Continuous experimentation is a development practice where the entire R&D process is guided by conducting experiments and collecting feedback. As a part of this thesis, a case study is conducted in a medium-sized software company. The research objective is to analyze the challenges, benefits and organizational aspects of continuous delivery and continuous experimentation in the B2B domain. The data is collected from interviews conducted with members of two teams developing two different software products. The results suggest that technical challenges are only one part of the challenges a company encounters in this transition. For continuous delivery, the company must also address challenges related to the customer and to procedures. The core challenges are caused by having multiple customers with diverse environments and unique properties, whose business depends on the software product. Some customers also require manual acceptance testing to be performed, which slows down production deployments. For continuous experimentation, the company also has to address challenges related to the customer and organizational culture. An experiment which reveals value for a single customer might not reveal as much value for other customers due to unique properties in each customer's business. Additionally, the speed at which experiments can be conducted depends on the speed at which production deployments can be made. The benefits found from these practices support the case company in solving many of its business problems. The company can expose the software functionality to the customers from an earlier stage, and guide the product development by utilizing feedback and data instead of opinions.
  • Koutsompinas, Ioannis Jr (2021)
    In this thesis we study extension results related to compact bilinear operators in the setting of interpolation theory and more specifically the complex interpolation method, as introduced by Calderón. We say that: 1. the bilinear operator T is compact if it maps bounded sets to sets of compact closure. 2.\bar{ A} = (A_0,A_1) is a Banach couple if A_0,A_1 are Banach spaces that are continuously embedded in the same Hausdorff topological vector space. Moreover, if (Ω,\mathcal{A}, μ) is a σ-finite measure space, we say that: 3. E is a Banach function space if E is a Banach space of scalar-valued functions defined on Ω that are finite μ-a.e. and so that the norm of E is related to the measure μ in an appropriate way. 4. the Banach function space E has absolutely continuous norm if for any function f ∈ E and for any sequence (Γ_n)_{n=1}^{+∞}⊂ \mathcal{A} satisfying χ_{Γn} → 0 μ-a.e. we have that ∥f · χ_{Γ_n}∥_E → 0. Assume that \bar{A} and \bar{B} are Banach couples, \bar{E} is a couple of Banach function spaces on Ω, θ ∈ (0, 1) and E_0 has absolutely continuous norm. If the bilinear operator T : (A_0 ∩ A_1) × (B_0 ∩ B_1) → E_0 ∩ E_1 satisfies a certain boundedness assumption and T : \tilde{A_0} × \tilde{B_0} → E_0 compactly, we show that T may be uniquely extended to a compact bilinear operator T : [A_0,A_1]_θ × [B_0,B_1]_θ → [E_0,E_1]_θ where \tilde{A_j} denotes the closure of A_0 ∩ A_1 in A_j and [A_0,A_1]_θ denotes the complex interpolation space generated by \bar{A}. The proof of this result comes after we study the case where the couple of Banach function spaces is replaced by a single Banach space.
  • Vazquez Muiños, Henrique (2016)
    In this thesis we consider an extension of the Standard Model (SM) with an SU(2)-symmetric dark sector, and study its viability as a dark matter (DM) model. In the dark sector, a hidden Higgs mechanism generates three massive gauge bosons, which are the DM candidates of the model. We allow a small coupling between the SM Higgs and the scalar of the dark sector, such that there is a scalar mixing. We study the new interactions in the model and analyse the consequences of the scalar mixing: new possible decays of the Higgs into DM, Higgs decay rates and production cross sections different from SM predictions, and possible interactions between DM and normal matter. We study the evolution of the DM abundance from the early universe to the present and compare the relic densities that the model yields with the experimental value measured by the Planck satellite. We compute the decay rates for the Higgs in the model and test whether they are consistent with the experimental data from ATLAS, CMS and Tevatron. We calculate the cross section for the interaction between DM and normal matter and compare it with the data from the latest direct detection experiments, LUX and XENON100. We discuss the impact of the experimental constraints on the parameter space of the model, and find the regions that give the best fit to the experimental data. In this work we show that the agreement with the experiments is optimal when both the DM candidates and the dark scalar are heavier than the Higgs boson.
  • Laurila, Terhi (2016)
    An intense storm named Mauri swept over Lapland, Finland on the 22nd of September 1982, causing 3 Mm3 of forest damage and two fatalities. It has been suggested that Mauri originated from the category 4 hurricane Debby, but the linkage between Debby and Mauri and their connection to climatic conditions have not been investigated before. In this thesis, a climatic overview of September 1982 in comparison to the 1981-2010 Septembers is provided. The calculations are based on ERA-Interim reanalysis data produced by the European Centre for Medium-Range Weather Forecasts. The track of the storm is determined from ERA-Interim data from the time Debby occurred until Mauri crossed Finland. The evolution of Debby is also presented with the storm track data from the National Oceanic and Atmospheric Administration for comparison. The extratropical transition (ET) and phase diagram of Debby and the synoptic evolution of Mauri are examined. ET is defined to start when the cyclone loses its symmetric hurricane eye feature and forms asymmetric fronts, and ET is completed when the warm core of the storm turns cold. A comparison between Mauri and two other intense storms that have affected Europe is briefly presented. It was discovered that Debby completed ET before rapidly crossing the North Atlantic. However, near the UK ex-Debby started to lose its cold core and the asymmetric structure typical of an extratropical cyclone. Ex-Debby transitioned back to a warm core while crossing Sweden, and at the same time it was rapidly deepening, by up to 27 hPa in 24 hours, defining the storm as a meteorological bomb. Ex-Debby developed a frontal structure along a pre-existing cold front before hitting Lapland. It merged with the pre-existing low pressure center from the Norwegian Sea and proceeded right ahead of an upper trough, a region for cyclogenesis. These made the storm, now named Mauri, more intense as it crossed Lapland, and led to 30 m/s winds according to the Finnish Meteorological Institute. Meanwhile, an occluded bent-back front approached Mauri, wrapped around the storm, trapping the warmer air inside it, and formed a warm seclusion. Due to that, Mauri regained its symmetric structure before reaching the Barents Sea. Examining the climatic aspect, positive surface pressure and temperature anomalies over central Europe caused the jet stream to shift northward. Also, positive NAO and AO phases shifted the storm tracks in general to higher latitudes. Hence, climatic conditions favoured a more northerly storm track. The results of this thesis suggest that Mauri was the remnant of hurricane Debby. It was shown that ERA-Interim was successful in locating the evolution of a cyclone and analysing its structure, whereas it underestimated the surface pressure and wind speed values. Future work is still needed, for instance comparing these results to different reanalyses and compiling a statistical examination of hurricane-originated storms in Europe, in order to adapt these methods and climatic indicators to future cases and storm predictions.
  • Ryyppö, Timo (2012)
    The Eyjafjallajökull volcano in Iceland erupted on 14 April 2010. The intensity of the eruption and the resulting ash cloud halted air traffic in nearly all of Europe, including Finland. This master's thesis explains why a volcano erupted in Iceland in particular, and what effects the eruption had. The thesis presents different methods for observing and modelling the transport of ash. By interpreting and combining the results of various models and methods, the aim is to answer the question: was there ash over Finland or not? The thesis focuses on remote sensing measurements both from satellites and from the ground. The satellite data originate from satellites received at the Finnish Meteorological Institute's (FMI) meteorological research centre in Lapland. The satellite instruments used are MODIS (Moderate Resolution Imaging Spectroradiometer) and OMI (Ozone Monitoring Instrument). The ground-based remote sensing observations were made with a Brewer spectrophotometer and a PFR sun photometer (Precision Filter Radiometer). PFR measurements are available from both Sodankylä and Jokioinen, and Brewer measurements from Sodankylä. In addition to the remote sensing data, the thesis uses two numerical models: FMI's SILAM (System for Integrated modeLling of Atmospheric coMposition) and the UK Met Office's NAME (Numerical Atmospheric-dispersion Modelling Environment). In addition, FMI's sulphur dioxide measurement network is used both for comparison with the satellite data and for trajectory analysis. Several different approaches were used in the thesis to determine the presence and amount of ash. However, no single measurement or model can establish with certainty whether there was volcanic ash over Finland in spring 2010 or not. This can be considered the most important conclusion of the thesis. In observing and monitoring volcanic eruptions, it is therefore important to combine the results of different measurements and models and to examine them as a larger whole.