Browsing by Title


  • Kujanpää, Antti (2016)
    In this master's thesis we explore the mathematical model of classical Lagrangian mechanics with constraints. The main focus is on the nonholonomic case, which is obtained by letting the constraint distribution be nonintegrable. Motivation for the study arises from various physical examples, such as a rolling rigid body or a snakeboard. In Chapter 2, we introduce the model and derive the associated equations of motion in several different forms, using the Lagrangian variational principle as a basis for the kinematics. We also show how nonintegrability of the constraint distribution is linked to some external forces via the Frobenius theorem. Symmetric mechanical systems are discussed in Chapter 3. We define the concept for a Lagrangian system with constraints and show how any free and proper Lie group action induces an intrinsic vertical structure on the tangent bundle of the configuration manifold. The associated bundle is used to define the nonholonomic momentum, which is a constrained version of the form that appears in the modern formulation of the classical Noether's theorem. One applies the classical Noether's theorem to a symmetric system with integrable constraints by restricting observation to an integral submanifold. This procedure, however, is not always possible. In nonholonomic mechanics, a Lie group symmetry implies only an additional equation of motion rather than an actual conservation law. In Chapter 4, we introduce a coordinate-free technique to split the Lagrangian variational principle into two equations, based on the Lie group invariance. The equations are intrinsic, that is to say, independent of the choice of connections, related parallel transports and covariant differentiation. The vertical projection associated with the symmetry may be varied to alter the representation and shift the balance between the two equations. In Chapter 5, the results are applied to the rattleback, which is a Lagrangian model for a rigid, convex object that rolls without sliding on a plane. We calculate the nonholonomic momentum and state the equations of motion for a pair of simple connections. One of the equations is also solved with respect to a given solution of the other. The thesis is mainly based on the articles 'Nonholonomic Mechanical Systems with Symmetry' (A.M. Bloch, P.S. Krishnaprasad, J.E. Marsden, and R.M. Murray, 1996), 'Lagrangian reduction by stages' (H. Cendra, J.E. Marsden, and T.S. Ratiu, 2001), 'Geometric Mechanics, Lagrangian Reduction and Nonholonomic Systems' (H. Cendra, J.E. Marsden, and T.S. Ratiu, 2001) and the book 'Nonholonomic mechanics and control' (A.M. Bloch, 2003).
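The constrained equations of motion discussed above take, in one standard coordinate form, the shape of the Lagrange-d'Alembert equations; the notation here is generic (constraint one-forms ω^a and multipliers λ_a are schematic conventions, not necessarily the thesis's own):

```latex
% Lagrange-d'Alembert equations for a Lagrangian L(q, \dot q)
% subject to linear constraints \omega^a_i(q)\,\dot q^i = 0:
\frac{\mathrm{d}}{\mathrm{d}t}\frac{\partial L}{\partial \dot q^i}
  - \frac{\partial L}{\partial q^i}
  = \lambda_a\, \omega^a_i(q),
\qquad
\omega^a_i(q)\,\dot q^i = 0 .
```

When the distribution defined by the ω^a is integrable, the multiplier terms can be eliminated by restricting to an integral submanifold; in the nonholonomic case they cannot, which is the source of the extra equation of motion mentioned above.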
  • Mirka, Anssi (2012)
    In this thesis I present the general theory of semigroups of linear operators. From the philosophical point of view I begin by connecting deterministic evolution in time to dynamic laws that are stated in terms of a differential equation. This leads us to associate semigroups with the models for autonomous deterministic motion. From the historical point of view I reflect upon the history of the exponential function and its generalizations. I emphasize their role as solutions to certain linear differential equations that characterize both exponential functions and semigroups. This connection then invites us to consider semigroups as generalizations of the exponential function. I believe this angle of approach provides us with motivation as well as useful ideas. From the mathematical point of view I construct the basic elements of the theory. First I briefly consider uniformly and strongly continuous semigroups. After that I move on to the more general σ(X, F)-continuous case. Here F is a so-called norming subspace of the dual X^*. I prove the existence of both the infinitesimal generator S of the semigroup and the resolvent (λ - S)^(-1), as well as some of their basic properties. Then I turn to the other direction and show how to create a semigroup starting from its generator. That is the content of the famous Hille–Yosida Theorem. From the practical point of view I give some useful characterizations of the generator in terms of dissipativity and accretivity. These techniques also lead us to an effortless proof of Stone's Theorem on unitary groups. Finally, from an illustrational point of view I give two applications. The first is about multiplicative semigroups on L^p spaces, where the setting is simple enough to allow intuition to accompany us. The second takes on the problem of generating a particular stochastic weak*-continuous semigroup. It serves to illustrate some of our results.
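The guiding idea that a semigroup generalizes the exponential function can be checked concretely in the scalar model case; a minimal sketch (the generator value and test points below are invented for illustration):

```python
import math

def semigroup(t, a):
    """T(t) = e^{ta}: the one-dimensional model case of an operator semigroup."""
    return math.exp(t * a)

a = -0.7  # generator; a negative value gives a contraction semigroup
s, t = 0.3, 1.1

# Semigroup law: T(s + t) = T(s) T(t), and T(0) = I.
assert abs(semigroup(s + t, a) - semigroup(s, a) * semigroup(t, a)) < 1e-12
assert semigroup(0.0, a) == 1.0

# The infinitesimal generator recovered as the derivative at t = 0:
# S x = lim_{h -> 0} (T(h)x - x) / h, which here equals a * x.
h = 1e-6
x = 2.0
generator_estimate = (semigroup(h, a) * x - x) / h
print(generator_estimate)  # close to a * x = -1.4
```

The Hille–Yosida theorem runs this construction in reverse: it characterizes exactly which operators S arise as such derivatives of a semigroup.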
  • Vähä-Piikkiö, Olga (2015)
    The purpose of this study is to develop a method for optimizing the data assimilation system of the HIROMB-BOOS model at the Finnish Meteorological Institute by finding an optimal time interval and an optimal grid for the data assimilation. This is needed to balance the extra time the data assimilation adds to the runtime of the model against the improved accuracy it provides. Data assimilation is the process of combining observations with a numerical model to improve the accuracy of the model. There are different ways of doing this, some of which are covered in this work. The HIROMB-BOOS circulation model is a 3D forecast model for the Baltic Sea. The forecast variables are temperature, salinity, sea surface height, currents, ice thickness and ice coverage. Some of the most important model equations are explained here. The HIROMB-BOOS model at the Finnish Meteorological Institute has a preoperational data assimilation system that is based on the optimal interpolation method. In this study the model was run for a 2-month test period with different time intervals of data assimilation and different assimilation grids. The results were compared to data from five buoys in the Baltic Sea. The model gives more accurate results when the time interval of the data assimilation is small. The denser the data assimilation grid is, the better the results. An optimal time interval was determined taking into account the time the assimilation takes. An optimal grid was determined visually based on an optimal grid density, for which the added time had to be considered as well. The optimized data assimilation scheme was tested by performing a 12-month test run and comparing the results to buoy data. The optimized data assimilation has a positive effect on the model results.
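The optimal interpolation method mentioned above can be sketched in its simplest scalar form: the analysis weights the model background and the observation by the inverse of their error variances. This is a minimal illustration with invented numbers, not the operational HIROMB-BOOS implementation:

```python
def optimal_interpolation(background, observation, var_b, var_o):
    """Combine a model background value with an observation, weighting
    each by the inverse of its error variance."""
    gain = var_b / (var_b + var_o)          # weight given to the observation
    analysis = background + gain * (observation - background)
    # the analysis error variance is reduced below both input variances
    var_a = (1.0 - gain) * var_b
    return analysis, var_a

# Model background says surface temperature 14.0 C, a buoy measures 15.0 C;
# the background is assumed less reliable (larger error variance).
analysis, var_a = optimal_interpolation(14.0, 15.0, var_b=0.8, var_o=0.2)
print(analysis, var_a)  # analysis lies between the model value and the observation
```

On a grid the same formula becomes a matrix equation with spatial error covariances, which is why a denser assimilation grid costs runtime.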
  • Marttinen, Tom-Henrik (2016)
    A model in mathematical logic is called pseudo-finite if it satisfies only such sentences of first-order predicate logic as have a finite model. Modelled in its main part on Jouko Väänänen's article 'Pseudo-finite model theory', this text studies classic model theory restricted to pseudo-finite models. We provide a range of classic results expressed in pseudo-finite terms, while showing that a set of other well-known theorems fail when restricted to the pseudo-finite, unless modified substantially. The main finding remains that a major portion of the classic theory, including the Compactness Theorem, the Craig Interpolation Theorem and the Lindström Theorem, holds in an analogical form in the pseudo-finite theory. The thesis begins by introducing basic first-order model theory with the restriction to relational formulas. This purely technically motivated limitation doesn't exclude any substantial results or methods of the first-order theory, but it simplifies many of the proofs. With the introduction behind us, the text moves on to present all the classic results that will later be studied in pseudo-finite terms. To enable and ease this, we also provide some powerful tools, such as Ehrenfeucht-Fraïssé games. In the main part of the thesis we define pseudo-finiteness precisely and build a pseudo-finite model theory. We begin from easily adaptable results such as the Compactness and Löwenheim-Skolem Theorems and move on to trickier ones, exemplified by Craig Interpolation and Beth Definability. The section culminates in a Lindström Theorem, which is easy to formulate but hard to prove in pseudo-finite terms. The final chapter has two independent sections. The first one studies the requirements for a sentence to have a finite model, illustrates a construction of a finite model for a sentence that has one, and culminates in an exact finite model existence theorem. In the second one we define a class of models with a certain, island-like structure. We prove that the elements of this class are always pseudo-finite, and at the very end of the text we present a few examples of this class.
  • Turunen, Joonas (2014)
    In this work a random metric space, the Brownian map, which is almost surely homeomorphic to the two-dimensional Euclidean sphere, is constructed, and a possible discretization of the sphere using planar maps with quadrilateral faces is offered. First the Gromov-Hausdorff metric is constructed on the set of compact metric spaces. After this the Cori-Vauquelin-Schaeffer bijection is constructed, essentially between the set of plane trees and the set of planar quadrangulations, where the number of edges of the trees and the number of faces of the maps is the same fixed natural number, and the vertices of the trees additionally carry integer labels. On the basis of this bijective correspondence, the number of quadrangulations with n faces is easy to compute for all natural numbers n. It is observed that a uniformly distributed quadrangulation with n faces is a random variable in the space of compact metric spaces, and it is noted that it is meaningful to study the convergence of these random variables in the sense of distributions. After the Brownian map has been constructed, a recent result proved by Jean-François Le Gall and Grégory Miermont is presented, according to which the Brownian map is the appropriate scaling limit of the discrete planar maps in the sense of convergence in distribution. At the end of the thesis it is briefly assessed how well the Brownian map describes a uniformly distributed random metric on the sphere, and open problems related to the topic are presented. The work is partly motivated by applications to quantum gravity theory.
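The counting consequence of the Cori-Vauquelin-Schaeffer bijection can be made concrete: rooted planar quadrangulations with n faces are equinumerous with rooted planar maps with n edges, counted by Tutte's classical formula. A small sketch (an editorial illustration, not the thesis's own computation):

```python
from math import comb

def rooted_quadrangulations(n):
    """Number of rooted planar quadrangulations with n faces:
    Tutte's formula 2 * 3^n / ((n + 1)(n + 2)) * C(2n, n).
    The division is exact, since the expression counts maps."""
    return 2 * 3**n * comb(2 * n, n) // ((n + 1) * (n + 2))

for n in range(1, 6):
    print(n, rooted_quadrangulations(n))
# n = 1 gives 2, n = 2 gives 9, n = 3 gives 54; the count grows like
# 12^n up to polynomial factors, which is the growth rate underlying
# the n^{1/4} scaling toward the Brownian map.
```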
  • Feng, Weihang (2023)
    Since 2013, the environmental protection authorities in China have significantly reduced on-road emissions through the upgrading of emission standards, the improvement of fuel quality and economic instruments. However, the specific effect of the control policies on emissions and air quality is still difficult to quantify. This is mainly due to the shortage of data on vehicle emission factors and vehicle activities. In this research, we developed a 2008-2018 on-road emission inventory based on the Emission Inventory Preparation Guide (GEI) and an existing vehicle activity database. Our estimates suggest that CO and PM2.5 showed a relatively significant decrease, by 66.2% and 58.8% respectively, whereas the trends in NOx (5.8%) and NMVOC (-4.8%) were relatively stable. The Beijing-Tianjin-Hebei (BTH), Yangtze River Delta (YRD), Pearl River Delta (PRD) and Sichuan Basin (SCB) regions all showed a uniform trend, especially in NOx. For Beijing-Tianjin-Hebei, the significant decline in NOx might be caused by earlier implementation of emission standards and fuel quality requirements. In addition, we designed additional evaporation emission scenarios to verify the applicability of GEI in quantifying emission impacts on secondary pollutants (PM2.5 and O3). The results indicate that evaporation emissions contributed about 3.5% to the Maximum Daily Average 8-hour (MDA8) O3 concentration in Beijing, Shanghai and Nanjing, reaching up to 5.9%, 5.3% and 7.3% respectively, while the impact on PM2.5 is extremely limited. Our results indicate the feasibility of GEI in improving on-road emission inventory establishment while lowering its technical barrier, and its further application in quantifying the on-road emission contribution to air quality. It also shows strong potential for the environmental assessment of on-road policies and for short-term air quality assessment.
  • Pärni, Miika (2023)
    Self-Sovereign Identity is a new concept for managing digital identities in digital services. The purpose of Self-Sovereign Identity is to place the user at the center and move towards a decentralized model of identity management. Verifiable Credentials, Verifiable Presentations, Identity Wallets and Decentralized Identifiers are part of the Self-Sovereign Identity model. They have also recently been included in the OpenID Connect specifications to be used with the widely used authentication layer built on OAuth 2.0. OpenID Connect authentication can now be leveraged with Decentralized Identifiers (DIDs) and the public keys contained in DID Documents. This work assessed the feasibility of integrating Verifiable Credentials, Verifiable Presentations and Decentralized Identifiers with OpenID Connect in the context of two use cases. The first use case is to integrate the Verifiable Credentials and Presentations into an OpenID Connect server and utilise Single Sign-On in a federated environment. The second use case is to bypass the OpenID Provider and enable the Relying Party to authenticate directly with the Identity Wallet. Custom software components, the Relying Party, the Identity Wallet and the Verifiable Credential Issuer, were built to support the assessments. Two new authorization flows were designed for the two use cases. The Federated Verifiable Presentation Flow describes the protocol by which the Relying Party authenticates with the OpenID Provider, which receives the user information from the Wallet. The flow requires no changes for any Relying Party using the same OpenID Provider to authenticate and utilise Single Sign-On. The Verifiable Presentation Flow enables the Relying Party to authenticate directly with the Wallet. However, this flow requires multiple changes to the Relying Party, and the benefits of a federated environment, e.g. Single Sign-On, are not available. Both of the flows are useful for their own specific use cases. The new flows utilise the new segments of the Self-Sovereign Identity model and are promising steps towards self-sovereignty.
  • Dahl, Jani (2018)
    At the end of the inflationary epoch, about 10^(−12) seconds after the Big Bang singularity, the universe was filled with plasma consisting of quarks and gluons. At some stage the cooling of the universe could have led to the occurrence of first-order cosmological phase transitions that proceed by nucleation and expansion of bubbles all over the primordial plasma. Cosmological turbulence is generated as a consequence of bubble collisions and acts as a source of primordial gravitational waves. The purpose of this thesis is to provide an overview of cosmological turbulence as well as the corresponding gravitational wave production, and to compile some of the results obtained to this day. We also touch on the onset of cosmological turbulence by analysing shock formation. In the one-dimensional case, considering only right-moving waves, the result is Burgers' equation. The development of a power spectrum with random initial conditions under Burgers' equation is calculated numerically using the Euler method with sufficiently small step sizes. In both the viscid and inviscid cases, the result is the presence of a −8/3 power law in the inertial range at the time of shock formation.
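The kind of calculation described above can be sketched in miniature: the inviscid Burgers equation u_t + u u_x = 0 stepped forward with the explicit Euler method on a periodic grid, showing the steepening of a smooth wave toward a shock (the grid size, step size and initial data below are invented for illustration, not the thesis's actual setup):

```python
import math

N, L = 200, 2 * math.pi
dx = L / N
dt = 0.002
u = [math.sin(2 * math.pi * i / N) for i in range(N)]  # smooth initial wave

def max_slope(u):
    return max(abs(u[(i + 1) % N] - u[i]) / dx for i in range(N))

initial_slope = max_slope(u)
for _ in range(500):  # advance to t = 1.0, the shock formation time for sin(x)
    new = []
    for i in range(N):
        # upwind difference: look in the direction the wave comes from
        if u[i] >= 0:
            dudx = (u[i] - u[i - 1]) / dx
        else:
            dudx = (u[(i + 1) % N] - u[i]) / dx
        new.append(u[i] - dt * u[i] * dudx)
    u = new

print(initial_slope, max_slope(u))  # the wave front steepens markedly
```

The −8/3 spectral slope mentioned above is measured from the Fourier transform of such steepened profiles at the moment of shock formation.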
  • Kauvo, Sara (2022)
    Context: Factors that affect software team performance are a highly studied subject. One of the reasons for this is the subject's meaningfulness to companies and software teams, since anyone interested in improving team performance wants to know which factors affect team performance positively. What motivated us to do this thesis on this subject was our interest in both software teams and social sciences. Objective: This thesis's aim was to better understand how the factors selected in our unofficial interviews affect software team performance and how large this effect is. These selected factors are psychological safety, the team leader's behaviour and the team's gender diversity. Method: We conducted a literature review with a keyword search. When we needed to narrow the search by a factor we used factor-related words and, if needed, limited the subject area to computer science. All in all, 23 reference papers were selected in the search. Results: Our analysis shows that all of our factors have a positive impact on team performance, though the size of this impact depends on the factor. Psychological safety seems to have the biggest impact, the team leader's behaviour has a moderate impact, neither huge nor minuscule, and the team's gender diversity has only a very small impact. Conclusions: Ultimately we have concluded that all three chosen factors have a positive effect on software team performance. Of these three factors, psychological safety and the team leader's behaviour have the most significant impact on software team performance. For software team leaders it is therefore important to pay attention to these two factors, especially since they are even linked to each other.
  • Jarnila, Enni (2020)
    This thesis presents the concept of arbitrage and some applications of arbitrage pricing. An arbitrage opportunity means that there is a possibility to make money without any initial investment and without a risk of losing money. To start, some definitions are introduced from the fields of measure theory, probability theory and mathematical finance. Then the guidelines of the market models considered throughout the thesis are defined. The mathematical definitions of arbitrage and arbitrage pricing are introduced first in the simple setting of a one-period market model and then in a multi-period market model. As the main result of this thesis, the fundamental theorems of arbitrage pricing are introduced and proven. The first fundamental theorem of arbitrage pricing shows that a market is arbitrage-free if and only if there exists at least one risk-neutral probability measure, equivalent to the original probability measure, such that the discounted prices are martingales with respect to this risk-neutral measure. This is proven for the multi-period market model. The second fundamental theorem of arbitrage pricing shows that the completeness of a market model is equivalent to the existence of a unique risk-neutral probability measure. This is proven for the one-period market model. Finally, I look into some investing and hedging strategies: replicating payoffs and portfolio insurance. Some examples of commonly used option strategies are introduced, such as the butterfly spread and the iron condor.
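As a concrete illustration of one of the option strategies mentioned at the end, the payoff at expiry of a long butterfly spread can be sketched as follows (the strike values are invented for the example):

```python
# A butterfly spread is built from calls at three equally spaced strikes
# K1 < K2 < K3: long one call at K1, short two at K2, long one at K3.

def call_payoff(spot, strike):
    return max(spot - strike, 0.0)

def butterfly_payoff(spot, k1=90.0, k2=100.0, k3=110.0):
    return (call_payoff(spot, k1)
            - 2.0 * call_payoff(spot, k2)
            + call_payoff(spot, k3))

for s in (80, 90, 95, 100, 110, 120):
    print(s, butterfly_payoff(s))
# payoff is 0 outside [K1, K3] and peaks at K2 with value K2 - K1 = 10,
# so the strategy pays off when the underlying finishes near K2
```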
  • Hitruhin, Lauri (2012)
    In this paper we define and study the Julia set and the Fatou set of an arbitrary polynomial f, which is defined on the closed complex plane and whose degree is at least two. We are especially interested in the structure of these sets and in approximating the size of the Julia set. First, we define the Julia and Fatou sets by using the concepts of normal families and equicontinuity. Then we move on to proving many of the essential facts concerning these sets, laying foundations for the main theorems of this paper presented in the fifth chapter. By the end of this chapter we achieve quite a good understanding of the basic structure of the Julia set and the Fatou set of an arbitrary polynomial f. In the fourth chapter we introduce the Hausdorff measure and dimension along with some theorems regarding them. In this chapter we also say more about fractals and self-similar sets, for example the Cantor set and the Koch curve. The main goal of this chapter is to prove a well-known result which allows one to easily determine the Hausdorff dimension of any self-similar set that fulfils certain conditions. We end this chapter by calculating the Hausdorff dimension of the one-third Cantor set and the Koch curve by using the result described earlier, and notice that their Hausdorff dimension is not integer-valued. In the fifth chapter we study the structure of the Julia set further, concentrating on its connectedness, and introduce the Mandelbrot set. In this chapter we also prove the three main theorems of this paper. First we show a sufficient condition for the Julia set of a polynomial to be totally disconnected. This result, with some theorems proven in the third chapter, shows that in this case the Julia set is a Cantor-like set. The second result shows when the Julia set of a quadratic polynomial of the form f(z) = z^2 + c is a Jordan curve. The third and final result shows that given an arbitrary polynomial f, there exists a lower bound for the Hausdorff dimension of its Julia set, which depends on the polynomial f. This is the most important result of this paper.
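The sets studied above are usually explored numerically with the escape-time iteration; a minimal sketch for the quadratic family f(z) = z^2 + c (the sample points are invented, and the escape radius max(2, |c|) is the standard bound guaranteeing divergence):

```python
def stays_bounded(z, c, max_iter=200):
    """Escape-time test for the filled Julia set of f(z) = z^2 + c:
    once |z| exceeds max(2, |c|), the orbit provably tends to infinity."""
    radius = max(2.0, abs(c))
    for _ in range(max_iter):
        if abs(z) > radius:
            return False
        z = z * z + c
    return True

# For c = 0 the Julia set is the unit circle: points inside stay bounded.
print(stays_bounded(0.5 + 0j, 0j), stays_bounded(1.5 + 0j, 0j))

# For large |c| the Julia set is totally disconnected (Cantor-like),
# and even the critical point 0 escapes.
print(stays_bounded(0j, 3 + 0j))
```

Whether the critical point 0 escapes is exactly the membership test for the Mandelbrot set, which ties this iteration to the connectedness results described above.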
  • Sinko, Jaakko (2020)
    The purpose of this thesis is to act as a guide for the 2017 article 'A study guide for the l^2 decoupling theorem' by J. Bourgain and C. Demeter. However, this thesis is self-contained. The aim has been to give a detailed presentation and to handle the weight exponent E especially carefully in the arguments. We begin by presenting the decoupling inequality of the l^2 decoupling theorem and the associated Fourier-transform-like operator. The theorem concerns finding a satisfactory upper bound for the decoupling constant related to the inequality. We also list some general results that a graduate student might not be very familiar with; among them are a few consequences of Hölder's inequality. We move on to study the properties of the weight functions that we use in the L^p-norms in the decoupling. We present two operator lemmas to which we can reduce many of our arguments. The other lemma gives us the opportunity to use certain Schwartz functions in our proofs. We then move on to prove the l^2 decoupling theorem in the lower range 2 ≤ p ≤ 2n/(n−1). This includes the definition of multilinear decoupling constants and an iterative process.
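For orientation, the inequality in question has the following shape; the notation here is schematic and not necessarily the thesis's own (f has Fourier support in the δ-neighbourhood of the paraboloid, partitioned into caps θ of diameter δ^{1/2}, and f_θ denotes the piece of f with Fourier support in θ):

```latex
\| f \|_{L^p(\mathbb{R}^n)}
  \;\le\; \operatorname{Dec}(\delta, p)
  \Bigl( \sum_{\theta} \| f_\theta \|_{L^p(\mathbb{R}^n)}^{2} \Bigr)^{1/2},
\qquad
\operatorname{Dec}(\delta, p) \lesssim_{\varepsilon} \delta^{-\varepsilon} .
```

The "satisfactory upper bound" mentioned above is the statement that the decoupling constant grows slower than any negative power of δ in the stated range of p.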
  • Pirttinen, Nea (2020)
    Crowdsourcing has been used in computer science education to alleviate teachers' workload in creating course content, and as a learning and revision method for students through its use in educational systems. Tools that utilize crowdsourcing can be a great way for students to further familiarize themselves with the course concepts, all while creating new content for their peers and future course iterations. In this study, student-created programming assignments from the second week of an introductory Java programming course are examined alongside the peer reviews these assignments received. The quality of the assignments and the peer reviews is inspected, for example, by comparing the peer reviews with expert reviews using inter-rater reliability. The purpose of this study is to inspect what kinds of programming assignments novice students create, and whether the same novice students can act as reliable reviewers. While it is not possible to draw definite conclusions from the results of this study due to limitations concerning the usability of the tool, the results seem to indicate that novice students are able to recognise differences in programming assignment quality, especially with sufficient guidance and well-thought-out instructions.
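Peer-versus-expert agreement of the kind described above is commonly quantified with Cohen's kappa, which corrects raw agreement for chance; a minimal sketch with invented ratings (the study does not specify which reliability coefficient it used, so this is only an illustration of the idea):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement: product of each rater's marginal label frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (observed - expected) / (1.0 - expected)

peer   = ["good", "good", "poor", "good", "poor", "poor"]
expert = ["good", "good", "poor", "poor", "poor", "good"]
print(round(cohens_kappa(peer, expert), 3))  # 4/6 raw agreement, kappa = 1/3
```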
  • Myller, Mika (Helsingin yliopisto / University of Helsinki / Helsingfors universitet, 2005)
  • Vartiainen, Pyörni (2024)
    Sums of log-normally distributed random variables arise in numerous settings in the fields of finance and insurance mathematics, typically to model the value of a portfolio of assets over time. In particular, the use of the log-normal distribution in the popular Black-Scholes model allows future asset prices to exhibit heavy tails whilst still possessing finite moments, making the log-normal distribution an attractive assumption. Despite this, the distribution function of a sum of log-normal random variables cannot be expressed analytically, and it has therefore been studied extensively through Monte Carlo methods and asymptotic techniques. The asymptotic behavior of log-normal sums is of special interest to risk managers who wish to assess how a particular asset or portfolio behaves under market stress. This motivates the study of the asymptotic behavior of the left tail of a log-normal sum, particularly when the components are dependent. In this thesis, we characterize the asymptotic behavior of the left and right tail of a sum of dependent log-normal random variables under the assumption of a Gaussian copula. In the left tail, we derive exact asymptotic expressions for both the distribution function and the density of a log-normal sum. The asymptotic behavior turns out to be closely related to Markowitz mean-variance portfolio theory, which is used to derive the subset of components that contribute to the tail asymptotics of the sum. The asymptotic formulas are then used to derive expressions for expectations conditioned on log-normal sums. These formulas have direct applications in insurance and finance, particularly for the purposes of stress testing. However, we call into question the practical validity of the assumptions required for our asymptotic results, which limits their real-world applicability.
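The dependence structure described above, log-normal marginals coupled through a Gaussian copula, is straightforward to simulate, which is how such tail probabilities are typically estimated in the absence of a closed form; a minimal Monte Carlo sketch with invented parameters:

```python
import math, random

random.seed(1)
rho = 0.5  # correlation of the underlying Gaussian copula

def lognormal_sum_sample():
    """One draw of S = e^{X1} + e^{X2}, X1, X2 standard normal with corr rho."""
    z1 = random.gauss(0.0, 1.0)
    # Cholesky step: z2 is standard normal with corr(z1, z2) = rho
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
    return math.exp(z1) + math.exp(z2)

samples = [lognormal_sum_sample() for _ in range(100_000)]

def left_tail(s):
    """Empirical estimate of P(S <= s)."""
    return sum(x <= s for x in samples) / len(samples)

# The left tail thins out very quickly as s decreases toward 0, which is
# why asymptotic formulas are valuable there: plain Monte Carlo needs
# enormous sample sizes to resolve deep-tail probabilities.
print(left_tail(1.0), left_tail(0.5), left_tail(0.25))
```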
  • Arola, Aleksi (2021)
    Freshwater ecosystems are an important part of the carbon cycle. Boreal lakes are mostly supersaturated with CO2 and act as sources of atmospheric CO2. Dissolved CO2 exhibits considerable temporal variation in boreal lakes. Estimates for CO2 emissions from lakes are often based on surface water pCO2 and modelled gas transfer velocities (k). The aim of this study was to evaluate the use of a water column stratification parameter as a proxy for surface water pCO2 in Lake Kuivajärvi. The Brunt-Väisälä frequency (N) was chosen as the measure of water column stratification due to its simple calculation process and encouraging earlier results. The relationship between N and pCO2 was evaluated during 8 consecutive May–October periods between 2013 and 2020. The optimal depth interval for the N calculation was obtained by analysing temperature data from 16 different measurement depths. The relationship between N and surface pCO2 was studied by regression analysis, and the effects of other environmental conditions were also considered. The best results for the full study period were obtained with a linear fit and an N calculation depth interval spanning from 0.5 m to 12 m. However, considering only June–October periods resulted in improved correlation, with the relationship between the variables more closely resembling exponential decay. There was also strong inter-annual variation in the relationship. The proxy often underestimated pCO2 values during the spring peak, but provided better estimates in summer and autumn. The boundary layer method (BLM) was used with the proxy to estimate the CO2 flux, and the result was compared to fluxes from both the BLM with measured pCO2 and the eddy covariance (EC) technique. Both BLM fluxes compared poorly with the EC flux, which was attributed to the parametrisation of k.
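The N calculation referred to above can be sketched for a single pair of depths; the freshwater density formula below is a crude approximation around the 4 °C density maximum and all numbers are invented, so this is an illustration of the idea rather than the study's method:

```python
import math

G = 9.81  # gravitational acceleration, m s^-2

def density(temp_c):
    """Rough freshwater density (kg m^-3) near the 4 C density maximum."""
    return 999.97 * (1.0 - 8.0e-6 * (temp_c - 4.0) ** 2)

def brunt_vaisala(t_upper, t_lower, z_upper, z_lower):
    """N (s^-1) from temperatures at two depths (depths positive downward).
    N^2 = (g / rho0) * d(rho)/dz; stable stratification means density
    increases with depth, giving N^2 > 0."""
    rho_upper, rho_lower = density(t_upper), density(t_lower)
    rho0 = 0.5 * (rho_upper + rho_lower)
    n_squared = (G / rho0) * (rho_lower - rho_upper) / (z_lower - z_upper)
    return math.sqrt(max(n_squared, 0.0))

# Summer stratification over the study's 0.5-12 m calculation interval:
# 18 C near the surface, 8 C at 12 m (illustrative values).
print(brunt_vaisala(18.0, 8.0, 0.5, 12.0))  # on the order of 1e-2 s^-1
```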
  • Flinck, Jens (2023)
    This thesis focuses on statistical topics that proved important during a research project involving quality control in chemical forensics. This includes general observations about the goals and challenges a statistician may face when working together with a researcher. The research project involved analyzing a dataset with high dimensionality compared to the sample size in order to figure out whether parts of the dataset can be considered distinct from the rest. Principal component analysis and Hotelling's T^2 statistic were used to answer this research question. Because of this, the thesis introduces the ideas behind both procedures as well as the general idea behind multivariate analysis of variance. Principal component analysis is a procedure that is used to reduce the dimension of a sample. Hotelling's T^2 statistic, on the other hand, is a method for conducting multivariate hypothesis testing for a dataset consisting of one or two samples. One way of detecting outliers in a sample transformed with principal component analysis involves the use of Hotelling's T^2 statistic. However, using both procedures together violates the theoretical assumptions behind Hotelling's T^2 statistic. Due to this, the resulting information is considered more of a guideline than a hard rule for the purposes of outlier detection. To figure out how the different attributes of the transformed sample influence the number of outliers detected according to Hotelling's T^2 statistic, the thesis includes a simulation experiment. The simulation experiment involves generating a large number of datasets. Each observation in a dataset contains the number of outliers according to Hotelling's T^2 statistic in a sample that is generated from a specific multivariate normal distribution and transformed with principal component analysis. The attributes that are used to create the transformed samples vary between the datasets, and in some datasets the samples are instead generated from two different multivariate normal distributions. The datasets are observed and compared against each other to find out how the specific attributes affect the frequencies of different numbers of outliers in a dataset, and to see how much the datasets differ when a part of the sample is generated from a different multivariate normal distribution. The results of the experiment indicate that the only attributes that directly influence the number of outliers are the sample size and the number of principal components used in the principal component analysis. The mean number of outliers divided by the sample size is smaller than the significance level used for the outlier detection and approaches the significance level as the sample size increases, implying that the procedure is consistent and conservative. In addition, when some part of the sample is generated from a different multivariate normal distribution than the rest, the frequency of outliers can increase significantly. This indicates that the number of outliers according to Hotelling's T^2 statistic in a sample transformed with principal component analysis can potentially be used to confirm that some part of the sample is distinct from the rest.
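The outlier score at the heart of the simulation can be illustrated in a small two-dimensional case: per-observation Hotelling's T^2 values with one planted outlier. This is a pure-Python sketch with invented sample sizes; in the thesis the statistic is applied to principal component scores of much higher-dimensional data:

```python
import random

# T_i^2 = (x_i - mean)^T S^{-1} (x_i - mean): squared Mahalanobis distance
# of each observation from the sample mean, using the sample covariance S.

random.seed(0)
sample = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(200)]
sample.append((6.0, 6.0))  # one planted far-away point

n = len(sample)
mx = sum(x for x, _ in sample) / n
my = sum(y for _, y in sample) / n
sxx = sum((x - mx) ** 2 for x, _ in sample) / (n - 1)
syy = sum((y - my) ** 2 for _, y in sample) / (n - 1)
sxy = sum((x - mx) * (y - my) for x, y in sample) / (n - 1)

det = sxx * syy - sxy * sxy  # invert the 2x2 covariance matrix by hand

def t_squared(point):
    dx, dy = point[0] - mx, point[1] - my
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

scores = [t_squared(p) for p in sample]
print(max(scores), scores[-1])  # the planted point has by far the largest T^2
```

Declaring an observation an outlier then amounts to comparing its T^2 value against a cutoff chosen from the statistic's reference distribution at the desired significance level.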
  • Jentze-Korpi, Suvi (2017)
    This Master's thesis presents two medical concept systems in use (ICD and SNOMED CT) and compares their differences and intended uses. The development of ontologies is followed through an example from the relational model to the OWL language. In addition, the RIM model, which models healthcare-related information, is presented. The quality and content requirements of clinical documents are compared with the properties of the CDA architecture, which is intended for the storage and use of clinical documents. The thesis considers the joint use of an ontology-based information model and medical concept systems. Two example cases are used to go through ways of reconciling the CDA architecture with medical concept systems, and implementation considerations are discussed. In addition, perspectives related to the specification and deployment of a new information system and the effects of nationwide deployment are discussed.
  • Törnroos, Juha (2012)
    Traditional text search compares character strings found in the text, so a query such as 'Nokia' may return documents about the mobile phone manufacturer, the town of Nokia, or the protagonist of F. E. Sillanpää's novel Ihmiset suviyössä. This thesis presents an Information Retrieval (IR) method that makes it possible to search text documents with a precisely defined concept. A precisely defined concept means a concept defined in an ontology, a machine-understandable vocabulary. This thesis focuses in particular on events defined in a history ontology. The presented method aims to identify the concepts occurring in a document based on the semantics surrounding the words. More precisely, the semantics surrounding a word is obtained from a so-called semantic space, which is constructed with a mathematical method called Latent Semantic Analysis (LSA), and the surrounding semantics is applied to ontological query expansion. The performance of the model was evaluated with an experiment using the Finnish history ontology and articles from the Finnish-language Wikipedia encyclopedia. Due to difficulties encountered in the experimental setup, the evaluation remained incomplete. The thesis concludes with a discussion of the significance of the method for information retrieval in general, since the described method of mapping concepts defined in an ontology into the semantic space determined by text documents is new, and no previous research on its behaviour or development exists.
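The semantic space described above rests on a truncated singular value decomposition of the term-document matrix. A minimal pure-Python sketch recovering just the leading singular value by power iteration on A^T A (the tiny matrix is invented for illustration; real LSA keeps several singular triples of a large sparse matrix):

```python
import math, random

def matvec(rows, v):
    return [sum(r[j] * v[j] for j in range(len(v))) for r in rows]

def transpose(rows):
    return [list(col) for col in zip(*rows)]

def top_singular_value(a, iterations=200):
    """Power iteration: v converges to the top right-singular vector of A,
    and |A v| converges to the largest singular value."""
    at = transpose(a)
    v = [random.random() for _ in range(len(a[0]))]
    for _ in range(iterations):
        w = matvec(at, matvec(a, v))          # v <- A^T A v
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    av = matvec(a, v)
    return math.sqrt(sum(x * x for x in av))

random.seed(2)
a = [[3.0, 0.0],
     [0.0, 1.0]]
print(top_singular_value(a))  # close to 3, the largest singular value
```

Projecting both documents and ontology concept descriptions onto the leading singular directions is what allows the method to match a query concept against documents that never mention its exact label.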
  • Koivurova, Antti (2021)
    This thesis surveys the vast landscape of uncertainty principles of the Fourier transform. The research on these uncertainty principles began in the mid-1920s following a seminal lecture by Wiener, where he first gave the remark that condenses the idea of uncertainty principles: 'A function and its Fourier transform cannot be simultaneously arbitrarily small'. In this thesis we examine some of the most remarkable classical results where different interpretations of smallness are applied. More modern results and links to active fields of research are also presented. We make a great effort to give an extensive list of references to build a good broad understanding of the subject matter. Chapter 2 gives the reader sufficient basic theory to understand the contents of this thesis. First we talk about Hilbert spaces and the Fourier transform. Since they are very central concepts in this thesis, we try to make sure that the reader can get a proper understanding of these subjects from our description of them. Next, we study Sobolev spaces and especially the regularity properties of Sobolev functions. After briefly looking at tempered distributions we conclude the chapter by presenting the most famous of all uncertainty principles, Heisenberg's uncertainty principle. In Chapter 3 we examine how the rate of decay of a function affects the rate of decay of its Fourier transform. This is the most historically significant form of the uncertainty principle and therefore many classical results are presented, most importantly the ones by Hardy and Beurling. In 2012 Hedenmalm gave a beautiful new proof of the result of Beurling. We present the proof, after which we briefly talk about the Gaussian function and how it acts as the extremal case of many of the mentioned results. In Chapter 4 we study how the support of a function affects the support and regularity of its Fourier transform. The magnificent result by Benedicks and the results following it work as the focal point of this chapter, but we also briefly talk about the Gap problem, a classical problem with recent developments. Chapter 5 links the density-based uncertainty principle to Fourier quasicrystals, a very active field of research. We follow the unpublished work of Kulikov-Nazarov-Sodin, where first an uncertainty principle is given, after which a formula for generating Fourier quasicrystals, using a density condition from the uncertainty principle, is proved. We end by comparing this formula to other recent formulas generating quasicrystals.
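For reference, Heisenberg's uncertainty principle mentioned above reads as follows in one common normalization, with the Fourier transform taken as \hat f(ξ) = ∫ f(x) e^{−2πixξ} dx (the thesis's own normalization may differ, which changes only the constant); equality holds exactly for Gaussians:

```latex
\Bigl( \int_{\mathbb{R}} x^{2}\, |f(x)|^{2}\, \mathrm{d}x \Bigr)
\Bigl( \int_{\mathbb{R}} \xi^{2}\, |\hat f(\xi)|^{2}\, \mathrm{d}\xi \Bigr)
\;\ge\; \frac{\| f \|_{L^{2}(\mathbb{R})}^{4}}{16 \pi^{2}} .
```

This is the prototype of Wiener's remark: the two integrals measure the concentration of f and \hat f, and the inequality forbids both from being small at once.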