
Browsing by Title


  • Takko, Heli (2021)
    Quantum entanglement is one of the biggest mysteries in physics. In gauge field theories, the amount of entanglement can be measured with certain quantities. For an entangled system, these measures exhibit correlations in both time and spatial coordinates that do not fit our current understanding of the locality of the measures and correlations. Difficulties in obtaining probes for entanglement in gauge theories arise from the problem of nonlocality, which can be stated as the problem of decomposing the space of physical states into different regions. In this thesis, we focus on a particular supersymmetric Yang-Mills theory that is holographically dual to a classical gravity theory in an asymptotically anti-de Sitter spacetime. We introduce the most important holographic probes of entanglement and discuss the inequalities obtained from the dual formulation of the entanglement entropy. We introduce subregion duality as an interesting conjecture of holography that remains under active research. The understanding of subregion duality is not necessarily solid in arbitrary geometries, as new results have appeared that either suggest a violation of subregion duality or run against our common understanding of holography by reconstructing the bulk metric beyond the entanglement wedge. This thesis investigates this aspect of subregion duality by evaluating bulk probes, such as the Wilson loop, for two different geometries (deconfining and confining). We aim to find whether or not these probes remain inside the entanglement wedge. We find that, for both geometries in four dimensions, subregion duality is not violated; in other words, the reduced CFT state does not encode information about the bulk beyond the entanglement wedge. However, we cannot assume this is the case for arbitrary geometries, and therefore this topic remains of interest for future research.
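    The "most important holographic probes" referred to above center on the Ryu-Takayanagi prescription; as a reference point (a standard formula quoted for orientation, not taken from the thesis itself), the entanglement entropy of a boundary region A is computed from a minimal bulk surface:

        % Ryu-Takayanagi formula: \gamma_A is the minimal bulk surface
        % anchored on the boundary of A; G_N is Newton's constant.
        S_A = \frac{\mathrm{Area}(\gamma_A)}{4 G_N}

    The entanglement wedge discussed in the abstract is, roughly, the bulk region enclosed between A and this surface \gamma_A.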
  • Kivelä, Feliks (2022)
    The crystal structure of magnetite (Fe3O4) involves Fe3+ ions in sites with tetrahedral (Td) symmetry and Fe2+ and Fe3+ ions in sites with octahedral (Oh) symmetry. Magnetite exhibits several interesting physical phenomena, such as the Verwey transition, in which the roles of the different Fe sites are an active subject of research. In the X-ray standing wave (XSW) technique, incoming and diffracted X-ray beams interfere inside a crystal, creating a standing wave with the periodicity of the diffracting atomic lattice. The phase of the wave, i.e. whether the nodes are located on the lattice planes or between them, can be adjusted by finely tuning the diffraction angle. Changing the phase in this way makes it possible to selectively vary the contributions of different atoms and absorption types (dipole versus quadrupole) to the measured total absorption spectrum. Iron K-edge absorption spectra of magnetite were studied in the presence of an XSW in an experiment conducted at the European Synchrotron Radiation Facility (ESRF) in Grenoble, France. This thesis presents an analysis of the data gathered during the experiment, with the goal of decomposing the experimentally measured pre-edge peak into its constituent components. The methods used in the analysis include principal component analysis and fitting predicted absorption peaks, calculated with the Quanty software, to the experimental data. The results show that the dipole and quadrupole contributions of the tetrahedral sites respond to changes in the phase of the XSW in opposite ways, in a manner consistent with theoretical predictions.
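    As a rough illustration of the analysis route described above (not the thesis's actual code; the input file, array shapes, and component count are assumptions), a principal component decomposition of spectra measured at several XSW phases can be sketched as:

        import numpy as np
        from sklearn.decomposition import PCA

        # one measured pre-edge spectrum per row, taken at different XSW
        # phase settings (shape: n_phases x n_energies); file is hypothetical
        spectra = np.load("preedge_spectra.npy")

        pca = PCA(n_components=3)            # assume ~3 physical constituents
        scores = pca.fit_transform(spectra)  # weight of each component per phase

        # each row of components_ is a candidate constituent of the pre-edge peak
        print(pca.explained_variance_ratio_)
        print(scores[:5])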
  • Walia, Parampreet Singh (2013)
    We study the possible initial conditions of the universe and the possibility of isocurvature perturbations in the early universe through CMB data. We consider three isocurvature modes: the cold dark matter density isocurvature (CDI) mode, the neutrino density isocurvature (NDI) mode and the neutrino velocity isocurvature (NVI) mode. We use three CMB datasets (WMAP, QUaD and ACBAR) to constrain the (possibly) correlated adiabatic and isocurvature models. For the CDI and NDI models we use both a phenomenological approach, where primordial perturbations are parametrized in terms of amplitudes at two different scales, and a slow-roll two-field inflation approach. For the NVI model we use only the phenomenological approach, since the NVI mode would occur only after neutrino decoupling, i.e., after inflation. We find that larger isocurvature fractions are allowed in the NDI and NVI models than in the corresponding CDI models. For generally correlated perturbations, we find the upper limits on the CDM density, neutrino density and neutrino velocity isocurvature fractions to be 4.5%, 9.8% and 12.4% respectively at k = 0.002 Mpc^-1. The analysis has also been done for the special cases of uncorrelated and fully (anti)correlated perturbations. We find no clear preference for a non-zero isocurvature fraction in the models considered, and the odds for a correlated isocurvature model compared to the standard adiabatic model are very low. We conclude that the present data supports the standard adiabatic model.
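    For orientation (a common convention; the thesis's exact parametrization at its two pivot scales may differ), the isocurvature fraction quoted above is typically defined from the primordial power spectra as

        % fraction of the total primordial power in the isocurvature mode
        \alpha(k) = \frac{\mathcal{P}_{\mathrm{iso}}(k)}{\mathcal{P}_{\mathcal{R}}(k) + \mathcal{P}_{\mathrm{iso}}(k)}

    where \mathcal{P}_{\mathcal{R}} is the adiabatic (curvature) spectrum, so the limits 4.5%, 9.8% and 12.4% bound \alpha at k = 0.002 Mpc^-1.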
  • Adio, Luqmon (2019)
    Particle induced X-ray emission (PIXE) was originally introduced as an ion-beam analytical technique in Lund in the 1970s and has since become one of the available techniques in many laboratories around the world. Here, an external-beam PIXE set-up is used to probe annual tree rings, with the goal of seeing the effects of volcanic eruptions from the perspective of trees growing in Finland. The theory part describes how volcanoes are formed, gives a brief history of volcanic activity, and covers the growth metabolism of trees and the production of characteristic X-rays. The two tree samples used in this experiment came from two different regions of Finland: the first is a pine from Parikkala (a small municipality near Savonlinna) in the south-eastern part of the country, and the second is a spruce from Pielavesi (near Kuopio) in central Finland. These samples were carefully prepared for irradiation. The collected spectra were analysed with PyMCA, a software package developed by the Software Group of the European Synchrotron Radiation Facility (ESRF). PyMCA is a ready-to-use and in many respects state-of-the-art set of applications implementing most of the needs of X-ray fluorescence data analysis, and it is used to interpret X-ray fluorescence spectra from a diverse array of samples.
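    In outline, the peak-fitting step that PyMCA automates looks like the following sketch (a generic illustration using scipy, not PyMCA's API; the input file and the choice of the Fe K-alpha line are assumptions):

        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian(E, amp, mu, sigma):
            """One characteristic X-ray line, background ignored for brevity."""
            return amp * np.exp(-0.5 * ((E - mu) / sigma) ** 2)

        # energy axis (keV) and counts from one tree-ring spectrum (hypothetical file)
        E, counts = np.loadtxt("ring_spectrum.txt", unpack=True)

        # fit e.g. the Fe K-alpha region around 6.40 keV
        mask = (E > 6.2) & (E < 6.6)
        popt, _ = curve_fit(gaussian, E[mask], counts[mask],
                            p0=[counts[mask].max(), 6.40, 0.08])
        area = popt[0] * popt[2] * np.sqrt(2 * np.pi)  # proportional to element content
        print("fitted line energy: %.3f keV, area ~ %.1f" % (popt[1], area))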
  • Olander, Amanda (2022)
    According to the curricula for both comprehensive school and upper secondary school, problem solving is one of the skills to be taught (Läroplanen, 2014, 2019). The more students get to try, do and understand for themselves during problem solving, the more rewarding the process becomes: motivation for mathematics increases (Lambdin, 2003) and learning becomes long-lasting (Läroplanen, 2019). This laid the foundation for this thesis. In the thesis I use Pólya's problem-solving model from 1973 to give an insight into problem solving in practice. The model consists of four steps: understanding the problem, devising a plan, carrying out the plan, and reflecting on the solution. The mathematical part of the thesis treats four subareas of probability in upper secondary school: classical probability, combinatorics, statistical probability and conditional probability, covered with examples, tables and figures. At the end of that chapter, the non-intuitive character of probability is discussed and common misconceptions in probability are brought up on the basis of earlier research and theory. Building on this character and these misconceptions, the following chapter presents ways to counteract them and to ease the teaching of probability with the help of problem solving. In the final chapter, four probability problems are presented, one for each subarea. The problems are discussed using Pólya's problem-solving model, and model solutions with figures and tables are given for each problem.
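    As one concrete instance of the content listed above (my illustration; the thesis's own examples are not reproduced here), conditional probability at this level is built on the definition

        % probability of A given that B has occurred
        P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad P(B) > 0

    and this definition is itself a common source of misconceptions, a standard example being the confusion of P(A | B) with P(B | A).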
  • Soininvaara, Katri (2017)
    In condition-based maintenance, data is collected from a machine to provide advice on the frequency and location of developing faults. Statistical inference is needed to transform the data into information on the health of the machine; the ultimate goal is to minimise machine down-time due to unexpected breakage. Predictive maintenance attempts to forecast the condition of the machine components from the observed data, and to maintain the machine just before it breaks down. The research question this thesis aims to solve is how to diagnose and predict component health based on data collected from the machine. Based on the literature, the hidden Markov model is selected for further study. There is usually uncertainty about the parameters and structure of the model due to the complicated causal relationships in the modelling problem. Therefore the thesis concentrates on finding a suitable inference algorithm that is able to learn the model from data. Six different frequentist and Bayesian algorithms are tested on a synthetic example. A hypothesis is put forward that a hybrid genetic variational Bayesian algorithm could be used to find the best performing hidden Markov model of component health. As expected, the hybrid variational algorithm performs better than the other examined algorithms, especially when there is uncertainty about the model structure. However, since there is typically an imbalance between the data depicting faults and the data depicting normal behaviour, the simulated test case shows that even the best performing variational algorithm has difficulties in identifying the correct model, which results in increased uncertainty in the health predictions. The thesis confirms that the hidden Markov model has many good qualities for modelling component health based on remote monitoring data. Due to the versatility of the model, it can be modified to account for the many details of component degradation behaviour in different machines.
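    For concreteness (a minimal sketch with toy parameters of my own choosing, not the thesis's model), the diagnosis step of a hidden Markov model over discrete health states reduces to a forward recursion:

        import numpy as np

        # toy component-health HMM: states 0=healthy, 1=degraded, 2=faulty
        A = np.array([[0.97, 0.02, 0.01],   # state transition probabilities
                      [0.00, 0.90, 0.10],
                      [0.00, 0.00, 1.00]])
        B = np.array([[0.80, 0.15, 0.05],   # P(observation symbol | state)
                      [0.30, 0.50, 0.20],
                      [0.05, 0.25, 0.70]])
        pi = np.array([1.0, 0.0, 0.0])      # machine assumed healthy at start

        def forward_filter(obs):
            """Return P(state_t | observations up to t) for each t."""
            alpha = pi * B[:, obs[0]]
            alpha /= alpha.sum()
            out = [alpha]
            for o in obs[1:]:
                alpha = (alpha @ A) * B[:, o]
                alpha /= alpha.sum()        # normalize to keep probabilities
                out.append(alpha)
            return np.array(out)

        print(forward_filter([0, 0, 1, 1, 2]))  # rising evidence of degradation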
  • Lampuoti, Jarkko (2021)
    Scandium-44 is a medically interesting positron- and gamma-emitting radionuclide with possible applications in molecular imaging. It is commonly produced with a cyclotron in a calcium-based, or sometimes a titanium-based, irradiation target. As the radiopharmaceutical use of scandium radionuclides commonly requires chelation, scandium needs to be separated from the target matrix. This is most often carried out either via extraction chromatography using a suitable solid phase or through precipitation-filtration. In this work, scandium-44, along with other scandium radionuclides, was produced by cyclotron irradiation with 10 MeV protons of a solid calcium carbonate or calcium metal target of natural isotopic abundance. Scandium was separated from the irradiated targets using four different chromatographic materials and a precipitation method. Scandium-44 was produced in kilo- and megabecquerel amounts with an average saturation yield of 47 MBq/μA. The separation yields achieved in a single elution ranged from 28 ± 11 % to 70 ± 20 %, with the best performing extraction material being UTEVA resin.
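    The saturation yield quoted above refers to the standard activation relation (standard radiochemistry usage; the abstract itself does not spell it out): for an irradiation of length t at beam current I, the produced activity approaches a current-normalised limit,

        % lambda is the decay constant of Sc-44; as t grows, A -> I * Y_sat
        A(t) = I \, Y_{\mathrm{sat}} \left( 1 - e^{-\lambda t} \right)

    so 47 MBq/μA is the activity per unit beam current that an infinitely long irradiation would yield.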
  • Kivi, Karita (2024)
    As a company's Information Technology (IT) solutions become more complex, the importance of good IT support systems increases. The versatility of IT solutions in production environments is constantly increasing, as the systems are very long-lived and incorporate ever more technology. A carefully built support capability can take over the environment comprehensively and support cooperation between different stakeholders. This thesis lays the groundwork for a case company to enhance the support capabilities of IT services in production environments. For a comprehensive take-over to succeed, it is important to first understand what the elements of the environment are and how they relate to each other. Tools for mapping the technology of the environment are developed in order to understand what kind of support is needed at each point in time. The methods developed are complemented by past problem cases and their solutions, collected through interviews. Based on the problems and solutions detected, we build a framework of standard solutions: what types of standard solutions would improve production support. Based on the standards, we create a responsibility matrix that identifies the stakeholders for developing the standards. The research explores extensively the structure of IT services in production environments and the benefits of standardizing different aspects. Our case company is a major player in the defense industry, allowing us to examine what kind of special requirements and characteristics exist in this closely controlled sector.
  • Hämäläinen, Jussi (2024)
    In this thesis, we aim to introduce the reader to profinite groups. Profinite groups are defined by two characteristics: firstly, they have a topology defined on them (notably, they are compact); secondly, they are constructed from some collection of finite groups, each equipped with the discrete topology and together forming what is known as an inverse system. The profinite group emerges as an inverse limit of its constituent groups. This definition is, at this point, necessarily quite abstract. Thus, before we can really understand profinite groups we must examine two areas. First, we will study topological groups. This gives us the means to treat groups as topological spaces. Topological groups have some characteristics that differentiate them from general topological spaces: in particular, a topological group is always a homogeneous space. Secondly, we will explore inverse systems and inverse limits, which will take us into category theory. While we could explain these concepts without categories, this thesis takes the view that category theory gives us a useful "50,000-foot view" by placing these ideas in a wider mathematical context. In the second chapter, we go through preliminary material on group theory, general topology and category theory that will be needed later. We begin with some basic concepts from group theory and point-set topology; these sections mostly contain material familiar from introductory university courses. The chapter then continues by introducing some basic concepts of category theory, including inverse systems and inverse limits. For these, we give an application by showing how the Cantor set is homeomorphic to an inverse limit of a collection of finite sets. In the third chapter, we examine topological groups and prove some of their properties. In the fourth chapter, we introduce an example of profinite groups: Z_p, the additive group of p-adic integers. This is expanded into a ring and then into the field Q_p. We discuss the uses of Z_p and Q_p and show how to derive them as inverse limits of finite, compact groups.
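    For reference (standard notation, consistent with the construction described above), the p-adic integers arise as the inverse limit of the finite quotient rings along the reduction maps:

        % an element is a coherent sequence (x_n) with
        % x_{n+1} \equiv x_n \pmod{p^n} for every n
        \mathbb{Z}_p \;=\; \varprojlim_{n}\, \mathbb{Z}/p^n\mathbb{Z}

    Each quotient is finite, hence compact in the discrete topology, and the inverse limit inherits the compactness that the thesis relies on.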
  • Speer, Jon (2020)
    The techniques used to program quantum computers are somewhat crude. As quantum computing progresses and becomes mainstream, a more efficient method of programming these devices would be beneficial. We propose a method that applies today's programming techniques to quantum computing, with program equivalence checking used to discern between code suited for execution on a conventional computer and on a quantum computer. This process involves expressing a quantum algorithm's implementation in a programming language. This so-called benchmark implementation can be checked against code written by a programmer, with semantic equivalence between the two implying that the programmer's code should be executed on a quantum computer instead of a conventional computer. Using a novel compiler optimization verification tool named CORK, we test for semantic equivalence between a portion of Shor's algorithm (representing the benchmark implementation) and various modified versions of this code (representing arbitrary code written by a programmer). Some of the modified versions are intended to be semantically equivalent to the benchmark, while others are semantically inequivalent. Our testing shows that CORK is able to correctly determine semantic equivalence or inequivalence in a majority of cases.
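    To make the idea concrete (a toy illustration of my own, not CORK and not the thesis's benchmark code), consider two implementations of the modular-exponentiation core of Shor's algorithm, checked for semantic equivalence by exhaustive testing over a small input space:

        def modexp_benchmark(a, e, n):
            """Reference: square-and-multiply modular exponentiation."""
            result, base = 1, a % n
            while e > 0:
                if e & 1:
                    result = (result * base) % n
                base = (base * base) % n
                e >>= 1
            return result

        def modexp_variant(a, e, n):
            """A programmer's rewrite; equivalent despite different structure."""
            return pow(a, e, n)

        # exhaustive check on a small domain stands in for a formal equivalence proof
        assert all(
            modexp_benchmark(a, e, n) == modexp_variant(a, e, n)
            for a in range(1, 20) for e in range(20) for n in range(2, 20)
        )
        print("semantically equivalent on the tested domain")

    A tool like CORK replaces the exhaustive testing with a symbolic argument, which is what makes the check feasible for real code.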
  • Siipola, Sade-Tuuli (2023)
    Data centers provide a demanding and complex environment for networking, as there is a need to provide fairness, throughput, and responsiveness while balancing great volumes of data and different types of flows. Programmable scheduling aims to make networking more flexible by providing capabilities for testing, modifying, and running a greater number of scheduling algorithms on switches than is currently possible. This is done by having a hardware design on top of which scheduling algorithms can be run as software. Over the years, multiple abstractions for the switch scheduler have been suggested, with the aim of being capable of running at line rate. This thesis is a literature review of programmable scheduler designs, focusing on the Push-In First-Out, Push-In Extract-Out, Strict Priority Push-In First-Out, and Admission-In First-Out designs. The work provides an overview of the designs and their hardware implementations, observing their strengths and weaknesses in the data center environment. The designs are compared to one another with a focus on trade-offs between metrics such as speed, expressiveness, and scalability, with a discussion of how these trade-offs ensure that there is currently no design that dominates the others in all aspects.
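    The Push-In First-Out abstraction at the heart of several of these designs can be sketched in a few lines (a behavioral model only; the hardware designs surveyed implement this at line rate): packets are pushed with an arbitrary rank but always dequeued from the head, i.e. smallest rank first.

        import heapq
        import itertools

        class PIFO:
            """Behavioral model of a Push-In First-Out queue."""
            def __init__(self):
                self._heap = []
                self._seq = itertools.count()  # FIFO tie-break for equal ranks

            def push(self, rank, pkt):
                heapq.heappush(self._heap, (rank, next(self._seq), pkt))

            def pop(self):
                return heapq.heappop(self._heap)[2]  # always the smallest rank

        q = PIFO()
        # ranks come from any scheduling policy, e.g. virtual finish times
        q.push(30, "flowB-pkt1"); q.push(10, "flowA-pkt1"); q.push(20, "flowA-pkt2")
        print(q.pop(), q.pop(), q.pop())  # flowA-pkt1 flowA-pkt2 flowB-pkt1

    The expressiveness of a design then comes down to which rank computations it supports and whether already-enqueued packets can be reordered or extracted.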
  • Puro, Touko (2023)
    GPUs have become an important part of large-scale and high-performance physics simulations due to their superior performance [11] and energy efficiency [23] over CPUs. This thesis examines how to accelerate an existing CPU stencil code, originally parallelized through message passing, with GPUs. Our first research question is how to utilize the CPU cores alongside GPUs when the bulk of the computation happens on GPUs. Secondly, we investigate how to address the performance bottleneck of data movement between CPU and GPU when there is a need to perform computational tasks originally intended to be executed on CPUs. Lastly, we investigate how the performance bottleneck of communication between processes can be alleviated to make better use of the available compute resources. We approach these problems by building a preprocessor designed to make an existing CPU codebase suitable for GPU acceleration, while the communication bottleneck is alleviated by extending an existing GPU-oriented library, Astaroth: we improve its task scheduling system and extend its domain-specific language (DSL) for stencil computations. Our solutions are demonstrated by making an existing CPU-based astrophysics simulation code, Pencil Code [4], suitable for GPU acceleration with the use of our preprocessor and the Astaroth library. Our results show that we are able to utilize CPU cores to perform useful work alongside the GPUs. We also show that we are able to circumvent the CPU-GPU data movement bottleneck by making code suitable for offloading through OpenMP offloading and code translation to GPU code. Lastly, we show that in certain cases Astaroth's communication performance is increased by around 21% through smaller message sizes, with the added benefit of 14% lower memory usage, which corresponds to around an 18% improvement in overall performance. Furthermore, we show the benefits of the improved tasking and an identified memory-performance trade-off.
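    The task-scheduling idea referenced above can be caricatured in a few lines (a toy model of my own, far simpler than Astaroth's actual scheduler): halo exchange and inner-domain computation are independent tasks, so they can run concurrently, and only the boundary update waits on the exchange.

        from concurrent.futures import ThreadPoolExecutor
        import time

        def halo_exchange():
            time.sleep(0.10)   # stand-in for inter-process communication
            return "halos"

        def compute_inner():
            time.sleep(0.12)   # stencil update on points needing no halo data
            return "inner done"

        def compute_boundary(halos):
            time.sleep(0.03)   # stencil update on points needing halo data
            return "boundary done with " + halos

        with ThreadPoolExecutor() as pool:
            t0 = time.perf_counter()
            comm = pool.submit(halo_exchange)
            inner = pool.submit(compute_inner)
            compute_boundary(comm.result())  # waits only on the exchange
            inner.result()
            print("overlapped time: %.2f s (vs ~0.25 s serialized)"
                  % (time.perf_counter() - t0))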
  • Olander, Tom (2020)
    This thesis asks what it means for the advanced mathematics syllabus in upper secondary school when programming is added according to the curriculum. First, earlier research on programming and mathematics in upper secondary school is reviewed, together with the conclusions reached in it. That programming is useful for students in working life is a given, but whether programming should belong to mathematics is one of the main questions of the thesis. Since there is little research on programming in mathematics in Finland, research done in other countries is also used; the same applies to the age group, so research on students of roughly the same age has been used as well. No teaching material yet exists for programming in upper secondary school. Therefore the thesis also proposes task types that could be used in teaching; these examples may be freely modified and used as an aid when planning instruction. Another use of this thesis could be as a basis for future planning of programming in upper secondary school, regarding both which subjects programming is handled in and how that teaching is organized.
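    In the spirit of the task types the thesis proposes (my own illustrative exercise, not one taken from the thesis), a typical programming-in-mathematics task at this level is a short simulation, for example a Monte Carlo estimate of pi:

        import random

        # Task: estimate pi by sampling random points in the unit square
        # and counting how many fall inside the quarter circle x^2 + y^2 <= 1.
        def estimate_pi(n):
            inside = sum(1 for _ in range(n)
                         if random.random() ** 2 + random.random() ** 2 <= 1)
            return 4 * inside / n

        for n in (100, 10_000, 1_000_000):
            print(n, estimate_pi(n))  # estimates converge toward 3.14159...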
  • Ahlskog, Niki (2019)
    The purpose of a progressive web application (PWA) is to blur, or even remove, the line between an application downloaded from an app store and an ordinary website. A PWA is like any normal website, but it additionally meets the following criteria: the application scales to any device; it is served over an encrypted connection; and it can be installed as a shortcut on a phone's home screen, in which case it opens without the browser's familiar navigation controls and can also be opened without a network connection. This thesis reviews the techniques for building a PWA and defines when an application qualifies as one. The speed of a PWA is measured with and without the Service Worker's caching features. The creation and deployment of a PWA is examined in an existing private client project, paying attention to the advantages and pain points the PWA brings. To evaluate the results, the application's progressiveness and speed are measured with Google Chrome's Lighthouse tool. In addition, a load-time test built on the Puppeteer library is run against the application several times, and the usefulness of the Service Worker cache is examined from the standpoint of performance and load time. To draw conclusions about the use of the Service Worker cache, the change in speed is examined with the progressive features switched on and off. The effects of a Service Worker on application speed are also examined through a Google case study. The test results show that using the Service Worker cache is faster in all cases: the Service Worker cache is faster than the browser's own cache, and even when the Service Worker is stopped and in a waiting state in the user's browser, activating it and using its cache is still faster than loading from the browser cache or directly from the network.
  • Hou, Jue (2019)
    Named entity recognition is a challenging task in the field of NLP. Like other machine learning problems, it requires a large amount of data for training a workable model. This remains a problem for languages such as Finnish due to the lack of data in linguistic resources. In this thesis, I propose an approach to automatic annotation in Finnish using limited linguistic rules and data from a resource-rich language, English, as a reference. Training a BiLSTM-CRF model, the preliminary results show that automatic annotation can produce annotated instances with high accuracy and that the model can achieve good performance for Finnish. In addition to automatic annotation and NER model training, two related experiments are conducted and discussed at the end of the thesis to show the practical application of the Finnish NER model.
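    The automatic-annotation idea can be caricatured as follows (a deliberately simple sketch; the thesis's pipeline, which leans on English resources, is more involved, and the gazetteer entries here are hypothetical): a list of known entities is used to produce token-level BIO labels for unlabeled Finnish text.

        # toy gazetteer-based annotator producing BIO tags for NER training data
        GAZETTEER = {
            ("Helsingin", "yliopisto"): "ORG",   # hypothetical entries
            ("Sanna", "Marin"): "PER",
            ("Suomi",): "LOC",
        }

        def annotate(tokens):
            tags = ["O"] * len(tokens)
            for i in range(len(tokens)):
                for entry, label in GAZETTEER.items():
                    n = len(entry)
                    if tuple(tokens[i:i + n]) == entry:
                        tags[i] = "B-" + label
                        for j in range(i + 1, i + n):
                            tags[j] = "I-" + label
            return list(zip(tokens, tags))

        print(annotate("Sanna Marin vieraili Helsingin yliopisto kampuksella".split()))

    Instances labeled this way can then serve as training data for a sequence model such as the BiLSTM-CRF used in the thesis.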
  • Kämäräinen, Matti (2013)
    The 2-meter temperature output (daily mean, minimum and maximum values) of a six-member regional climate model ensemble and the corresponding observations for three stations in Finland (Helsinki, Jyväskylä and Sodankylä) are used to produce future temperature projections. Both the observed ('delta change' approach) and the model scenario ('bias correction' approach) data series are statistically corrected with several different methods. These methods make use of the statistics of temperature between the 30-year periods of the observations, the model control and the model scenario simulations, and vary from simple (adjusting the mean) to complex (quantile mapping). Each month is processed separately. The main projection experiments are I) from 1951-1980 to 1981-2010 and II) from 1981-2010 to 2011-2040, 2041-2070 and 2069-2098. The method-dependent and, to a lesser extent, model-dependent results are evaluated by means of root mean square error, mean error (mean bias), the location of quantile points, the number of daily frequency indices, analysis of variance and sensitivity tests. In near-term projections (e.g. from 1981-2010 to 2011-2040) the more conservative delta change methods slightly outperform the bias correction methods. In mid-term (projections to 2041-2070) and especially far-term (projections to 2069-2098) predictions, the bias correction approach is better in cross-validation. The complicated shape of winter-time temperature distributions emphasizes the importance of handling the biases correctly, compared to southern, less snowy areas. For that reason the detailed quantile-mapping type of bias correction produces the best results for predictions extending to the end of the century.
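    The quantile-mapping correction named above can be sketched briefly (an empirical-CDF version with invented toy data; the thesis evaluates several variants): each scenario value is mapped through its quantile in the model control climate onto the observed distribution.

        import numpy as np

        def quantile_map(scenario, model_ctrl, obs_ctrl):
            """Empirical quantile mapping: map model values through the
            control-period model CDF onto the observed distribution."""
            model_sorted = np.sort(model_ctrl)
            obs_sorted = np.sort(obs_ctrl)
            # quantile of each scenario value within the model control climate
            q = np.searchsorted(model_sorted, scenario) / len(model_sorted)
            return np.quantile(obs_sorted, np.clip(q, 0.0, 1.0))

        # toy data: the model runs too cold with too little variance (assumption)
        rng = np.random.default_rng(0)
        obs_ctrl = rng.normal(0.0, 5.0, 10_000)        # observed control-period daily T
        model_ctrl = obs_ctrl * 0.8 - 1.0              # biased model control run
        scenario = rng.normal(2.0, 4.0, 10_000) - 1.0  # biased future run
        corrected = quantile_map(scenario, model_ctrl, obs_ctrl)
        print(round(corrected.mean(), 2), round(corrected.std(), 2))

    Because the whole distribution is corrected rather than just the mean, the complicated winter-time distribution shapes mentioned above are handled as well.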
  • Euren, Juhani (2019)
    This Master's thesis was carried out as a commission for Kesko Oyj. The subject of study was a multi-year, three-phase cMDM project whose goal was to clarify how customer data is collected, used and stored. The first phase of the cMDM project took place as early as 2013; phase 2 was carried out in 2015-2016 and phase 3 from February 2018 to April 2019. The research was conducted through narrative interviews with key experts from phases 2 and 3 of the project. Analysis of the narrative interviews identified challenges related to testing, to the transition from the project phase to the maintenance phase, and to staff retention in maintenance. Based on the identified challenges, the thesis proposes solutions for avoiding the challenges encountered in phase 2.
  • Popova, Vera (2023)
    The dynamically evolving nature of software development (SWD) necessitates an emphasis on lifelong learning (LLL) for professionals in this field. In today's rapidly changing technology domain, LLL has transitioned from an exceptional practice to an essential norm, recognized as a fundamental skill within the industry. This thesis investigates the current state, trends, challenges, and best practices of lifelong learning in SWD workplaces. Our research draws on a review of literature spanning the past two decades, complemented by semi-structured interviews conducted with software professionals of diverse experience levels, all employed in the capital region of Finland. Additionally, this thesis contributes to an existing classification of learning methods utilized by software professionals and to a lifelong learning framework grounded in established research. Our findings underscore the importance of fostering a culture of continuous learning in software development workplaces, and we offer recommendations for employers to consider. Nurturing a learning mindset and a personalized, employee-centered approach to learning increases learner autonomy and commitment, seamlessly integrating learning as a natural part of work and career development. Such a workplace culture enhances employee well-being as well as the employer's innovativeness and competitiveness, establishing lifelong learning as a mutually beneficial solution for both parties involved.
  • Moilanen, Simo (2014)
    The purpose of this work is to suggest an approach for holistically improving the performance of software development endeavors. Its contributions are a theory for holistically mapping the key performance drivers of a software development system, including the human-centric factors involved in knowledge work, and a method for applying the theory in practice via a monitoring Instrument, demonstrated with case studies. The findings support that the theory, complemented with the Instrument, provides holistic insight into a software development endeavor, reveals performance impediments, and allows performance improvement efforts to be concentrated on the key performance drivers, i.e. where waste causes the most performance decline. However, further research is required to fine-tune the Instrument for optimal performance indications. Suggestions for future work include an analysis of the Instrument's result coverage and research on a framework for quicker and easier analysis in more practical usage. Nevertheless, organizations can use the theory and the Instrument on an as-is basis to improve software development performance without further research. The originality and value of this thesis lie in challenging the application of traditional management and its methods in software development endeavors and in suggesting a new method for achieving higher production performance.