
Browsing by Subject "automation"


  • Karppinen, Jutta (2017)
    In vitro liver cell models are important systems for studying, for example, hepatotoxicity, which is one of the major causes of safety-related failures of drug candidates. 2D cell culture-based tests for compound screening are standard procedures in drug discovery, but data that reliably predicts in vivo outcomes is hard to obtain because cells in a monolayer are in an unnatural microenvironment. Cells in 3D culture systems, in turn, have more natural interactions with other cells and with the extracellular matrix, and their responses to drugs more closely resemble in vivo responses. In drug discovery and development, automating cell culture processes and compound screening saves time and costs and improves the consistency and sterility of the procedures. As 3D cell culture systems become more compatible with automation, they also become more promising for use in drug discovery and development. The aim of the study was to develop and optimize automated processes for preparing 3D cell cultures in 96-well plates. Cultures of HepG2, a human liver cancer cell line, in nanofibrillar cellulose (NFC) were prepared in well plates either manually or with an automated liquid handling system. To our knowledge, this was the first time that automated processes for cell seeding into NFC were used to prepare 3D cell cultures. Cell seeding steps that could be automated were identified and optimized based on visual analysis of the wells and the viability of the cells after seeding. After optimization, the manual and automated processes were compared by studying cell viability, morphology and functionality. The Alamar Blue assay, Live/Dead assay and fluorescence-activated cell sorting were used to study cell viability, while F-actin staining, differential interference contrast microscopy and light microscopy were used to investigate cell morphology. Cell functionality was analyzed by studying albumin secretion. Cells seeded using automation secreted normal amounts of liver-specific albumin.
Cells maintained viability, morphology and functionality for four days after seeding, although the viability results varied. Alamar Blue assays showed weaker development of viability in automation-seeded cultures, whereas the viability of manually seeded cells increased; in other experiments the results from cultures seeded manually or by automation were more similar. Possible explanations for the differences include the lower viscosity of the nanofibrillar cellulose, the longer time cells waited at room temperature before the automated processes, and the natural variability of cell studies. In the future, automated high-throughput screening of compounds could be performed in 3D cell cultures prepared by automation, which would save time and costs and improve the correlation between in vitro and in vivo studies.
  • Pietarinen, Julius (2023)
    Soil compaction has a major effect on soil fertility, and finding the compacted regions of a field is important for taking preventive action. Compacted layers can be located with a soil penetrometer, but hand-operated penetrometers are physically demanding to use. Automating the measurement with a machine removes that strain from the user, yields larger amounts of data, and achieves more accurate and consistent data than manual measurement techniques: machine automation provides a constant penetration speed and higher forces than manual operation while being independent of the user. The built automated penetrometer (AP) can be attached to an ATV or another small off-road machine such as a field robot. The AP was built from an S-type force sensor and a ball screw driven by a stepper motor; the force sensor is mounted on a sled running on linear guide rails. System control is handled by an Arduino microcontroller and data processing by a Raspberry Pi minicomputer. Surface moisture is measured during operation with a Meter Teros 10 capacitive soil moisture sensor. Initial tests were done at the Viikki research farm in Helsinki, Finland, on a perennial grass ley field, and the same measurements were taken from the same areas with a proven commercial Eijkelkamp soil penetrologger. The AP was found to be a practical and useful measurement device. With material costs of 2000 euros, it is significantly cheaper than the commercial device, and its data had smaller variance than the Eijkelkamp's. Open-source code makes it easy to modify and change the machine, and the integrated computer makes real-time data collection and processing easy. The developed measurement device will later be integrated into an automated field robot.
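The measurement principle the abstract describes (advance the cone at constant speed, log force against depth, and express the result as penetration resistance) can be sketched roughly as follows. This is an illustrative sketch only: the function and parameter names are invented, not taken from the thesis, and the real device pulses a stepper over a ball screw rather than calling a Python sensor function.

```python
def cone_index_mpa(forces_n, cone_area_m2=1e-4):
    """Convert raw force readings (N) to penetration resistance (MPa),
    assuming a 1 cm^2 cone base area (a common penetrometer cone size)."""
    return [f / cone_area_m2 / 1e6 for f in forces_n]

def measure_profile(read_force_n, step_mm=0.05, target_depth_mm=400.0):
    """Advance the cone one increment at a time at constant speed,
    logging (depth_mm, force_N) pairs; read_force_n stands in for the
    S-type load cell reading."""
    depth, log = 0.0, []
    while depth < target_depth_mm:
        depth += step_mm              # one ball-screw increment
        log.append((depth, read_force_n()))
    return log
```

On the real device the loop body would also drive the stepper and enforce a maximum-force safety limit to protect the sensor and the drivetrain.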
  • Higuera Ornelas, Adriana (2022)
    AI-driven innovation offers numerous possibilities for the public sector, and the potential of digital advancements is already palpable within tax administrations. Automation is used efficiently for tax assessments, compliance management, revenue collection and taxpayer services. A digital transformation encompassing Big Data, advanced analytics and ADM (automated decision-making) systems promises significant benefits and efficiencies for tax administrations. It is essential that public organizations have the necessary legal framework and safeguards in place when expanding the use of these automated systems, since their sources of information, technical capacity and extent of application have evolved. Using Finland as a case study, this research assesses the use of automated decision-making systems within the public sector. Constitutional and administrative legal principles serve as guidelines for, and constraints on, administrative activity and decision-making. The study examines the lawfulness of deploying ADM systems in the field of taxation by assessing their compatibility with long-standing legal principles. Focus is given to the principles of the rule of law, due process, good administration, access to information, official accountability, confidentiality, and privacy. Numerous public concerns have been raised regarding the use of ADM systems in the public sector, and scholars, academics and journalists have justifiably pointed out their risks and limitations. Despite the legal challenges posed by automation, this research suggests that ADM systems used to pursue administrative objectives can fit with long-standing legal principles given appropriate regulation, design and human capacity.
  • Pylkkö, Tuomas (2013)
    It is well known that the central nervous system is a highly isolated tissue. Because of this, the physico-chemical criteria to be met by an orally administered central nervous system drug are very strict. This work describes methods that can be used to select drug candidates and screening collections with a higher probability of being relevant to central nervous system drug development projects. It also argues that small-molecule chemical space is so vast that it is difficult to imagine progress without focusing screening collections in some way. Given that most commercially available compounds are very similar in some respects, this may well present a bottleneck for the progress of drug development as a whole; therefore, research on novel methods for compound production is also evaluated. In addition, this work describes the miniaturization and automation of a previously published ELISA-based assay. The assay measures the activation of a tyrosine kinase receptor (TrkB) expressed in a fibroblast cell line. The receptor and its endogenous ligand, brain-derived neurotrophic factor, have been linked to the mechanism of action of previously discovered medical interventions used in the treatment of depression. Such an assay can be used to discover either small-molecule agonists or antagonists acting on the receptor; these molecules could be clinically relevant in the treatment of depressive disorders and anxiety. It is demonstrated that it is indeed possible to miniaturize and automate the method, making it significantly more suitable for high-throughput screening. The original method was carried out in 24-well plates, with samples transferred to another plate for measurement; the new design uses 96-well plates and performs the entire process on the same plate.
  • Mäkelä, Mikko (2020)
    Ultrasonic transducers convert electric energy into mechanical energy at ultrasonic frequencies. High-power ultrasound is widely used in industry and in laboratories, e.g. in cleaning, sonochemistry and welding solutions. To be effective in these applications, a piezoelectric transducer must deliver maximal power to the medium, and most such systems rely on power delivery staying maximized during long driving sequences where stable performance is critical. Power ultrasonic transducers are typically narrowband devices with a high Q value, finely tuned to a specific resonance frequency. The resonance frequency can drift during driving due to temperature, mechanical loading and nonlinear effects. When a transducer's resonance frequency changes, the drastic change in its impedance (from resonance to anti-resonance) can quickly damage the driving electronics or the transducers themselves. In this work we developed a multi-channel high-power ultrasonic system with software-based resonance frequency tracking and driving frequency control. The implementation features a feedback loop that maximizes power delivery during long driving sequences in an ultrasonic cleaning vessel. With the feedback loop, the achieved total real power increased from 6.5 kW to a peak of almost 10 kW. The feedback loop also protected the electronics and transducers from failure caused by heating and varying impedance.
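The core idea of such a software feedback loop can be sketched as a simple hill-climbing frequency tracker: nudge the drive frequency in the direction that increases delivered real power, and reverse direction when power drops. This is a toy stand-in for the thesis' controller, with invented names; the real system measures power per channel on dedicated hardware and applies safety limits.

```python
def track_resonance(measure_real_power_w, f_start_hz, f_step_hz=5.0, n_iter=100):
    """Hill-climbing tracker: step the driving frequency toward higher
    measured real power, reversing the search direction on a power drop.
    measure_real_power_w stands in for the per-channel power measurement."""
    f = f_start_hz
    direction = +1.0
    p_prev = measure_real_power_w(f)
    for _ in range(n_iter):
        f_new = f + direction * f_step_hz
        p_new = measure_real_power_w(f_new)
        if p_new >= p_prev:
            f, p_prev = f_new, p_new  # moving the right way: keep going
        else:
            direction = -direction    # power dropped: reverse direction
    return f
```

Because the loop only ever compares successive power readings, it follows the resonance peak even as it drifts with temperature and loading, which is the behaviour the abstract attributes to the feedback loop.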
  • Lintuluoto, Adelina Eleonora (2021)
    At the Compact Muon Solenoid (CMS) experiment at CERN (the European Organization for Nuclear Research), the building blocks of the Universe are investigated by analysing the observed final-state particles resulting from high-energy proton-proton collisions. However, direct detection of final-state quarks and gluons is not possible due to a phenomenon known as colour confinement. Instead, event properties that correspond closely to their distributions, known as jets, are studied. Jets are central to particle physics analysis, and our understanding of them, and hence of our Universe, depends on our ability to measure their energy accurately. Unfortunately, current detector technology is imprecise, necessitating downstream correction of measurement discrepancies. To achieve this, the CMS experiment employs a sequential multi-step jet calibration process, performed several times per year and more often during periods of data collection. Automating jet calibration would increase the efficiency of the CMS experiment: by automating the code execution, the workflow could be performed independently of the analyst, which in turn would speed up the analysis and reduce the analyst's workload. In addition, automation facilitates higher levels of reproducibility. In this thesis, a novel method for automating the derivation of jet energy corrections from simulation is presented. To achieve automation, the methodology uses declarative programming: the analyst simply expresses what should be executed and no longer needs to determine how to execute it. Successfully automating the computation of jet energy corrections requires capturing detailed information about both the computational steps and the computational environment. The former is achieved with a computational workflow, and the latter with container technology.
This yields a portable and scalable workflow that is easy to maintain and to compare with previous runs. The results of this thesis strongly suggest that capturing complex experimental particle physics analyses with declarative workflow languages is both achievable and advantageous: the analyst's productivity improved and reproducibility was facilitated. The method is not without challenges, however. Declarative programming requires the analyst to think differently about the problem at hand, so there are some sociological barriers to methodological uptake; once the extensive benefits are understood, we anticipate widespread adoption of this approach.
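The declarative split the abstract describes (the analyst declares *what* steps exist, a generic engine decides *how* to run them) can be illustrated with a toy example. Everything here is invented for illustration: real systems of this kind use a dedicated workflow language and a container runtime, not a Python dict, and the step names and images below are not from the thesis.

```python
# A declarative description: ordered steps, each naming its command and
# the container image that fixes its computational environment.
workflow = {
    "steps": [
        {"name": "generate", "image": "example/sim:1.0", "cmd": "make_samples"},
        {"name": "derive",   "image": "example/jec:1.0", "cmd": "fit_corrections"},
    ]
}

def run(workflow, execute):
    """Generic engine: walks the declared steps in order.
    `execute(image, cmd)` abstracts the container runtime, so the same
    declaration is portable across machines and easy to re-run."""
    return [execute(step["image"], step["cmd"]) for step in workflow["steps"]]
```

The point of the separation is that the declaration itself carries no execution logic, so it can be versioned, compared with previous runs, and handed to any compatible engine unchanged.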
  • Ekholm, Malin (2020)
    Algorithms are effective data-processing programs that are being applied in an increasing number of contexts and areas of our lives. One such context is our working lives, where algorithms are being adopted to take over tasks previously performed by human workers. This has sparked discussion about the capabilities and agency of algorithmic technology, and about whether technology will replace the human workforce. Public discussion has actively constructed both opportunities and fears related to algorithmic technology, but very little research exists on the impact of algorithmic technology at work. Much of the discussion has also centered on the agency of algorithms, since, due to advances in technology, agency is no longer something assigned only to, or possessed only by, human actors. While some research has been done on the construction of algorithmic agency, very little has explored the phenomenon in the context of work. Research on adopting algorithms in companies is very scarce, and the gap is especially crucial because of the lack of a social-scientific perspective. The purpose of this thesis is to investigate how algorithmic agency (or the lack thereof) is constructed in the discourse of five employees of an IT company that has applied an algorithm in its operations. I further investigate what consequences these constructions have for the work of the employees and the flow of agency in the company. The theoretical and methodological framework is rooted in social constructionism and discursive psychology, and the analysis focuses on the construction of accounts of agency in this context. To answer the research questions, I conducted a semi-structured focused interview with each of the recruited employees.
The results show that algorithmic agency is constructed in multifaceted ways, and several constructions of agency coexist in the employees' discourse. The algorithm is constructed as an independent actor with agency, but this agency is also restricted by its human developers and by operational staff intervening in its decisions. While accounts of algorithmic agency exist, agency is also constructed as something possessed by the developers and the company, who develop the algorithm to reach certain goals. The results also show that the algorithm is constructed as both an enabler of and a restriction on human agency, and that its adoption has created new flows of agency, from human to algorithm and vice versa. This thesis contributes to previous research on agency, algorithms and work by taking a contemporary, employee-centric perspective on agency not taken by previous research. To account for the dynamic processes of agency when algorithmic technology is adopted in companies, an extensive social-scientific perspective is needed to inform organizational change; achieving this requires more qualitative research into the impact of automation on agency and other interpersonal dynamics.