
Browsing by Title


  • Ollinkangas, Joni (2022)
    The problems caused by hypromellose in sterile filtration of ophthalmic products in the pharmaceutical industry were investigated. The research project was performed at NextPharma Oy's ophthalmics manufacturing facility in Tampere during the autumn of 2020. Hypromellose is an excipient commonly used in ophthalmic products as a viscosity enhancer to prolong the contact time of the preparation on the eye surface. In the ophthalmics compounding process, hypromellose is first dispersed by slowly sprinkling it into a hot solution and mixing thoroughly, after which the solution is cooled to room temperature. During cooling, the hypromellose dissolves and gels, increasing the viscosity of the solution. Incomplete dispersion or dissolution of hypromellose during the manufacturing process can slow down the filtration rate or even clog the filter completely due to undissolved hypromellose polymer material. Hypromellose is an industrially produced cellulose derivative that often contains small amounts of unreacted cellulose and other sparingly soluble polymer particles as impurities, which can also cause problems in filtration processes. Sterile filtration is a commonly used sterilization method for ophthalmic products, in which the prepared bulk solution is filtered through a 0.1 to 0.2 µm pore size filter membrane into a sterile receiving vessel. Due to the very small pore size, sterile filters are easily clogged if the solution contains poorly dissolved material. The purpose of this work was to collect additional information on the possible causes of the clogging caused by hypromellose and to determine whether the filterability of a solution containing hypromellose can be improved by optimizing the manufacturing process parameters. A design of experiments was prepared as a two-level full-factorial test matrix without replicates and with three centre points. Four process parameters were studied (mixing time, mixing speed, dispersion temperature, and cooling temperature). Minimum and maximum levels for the parameters were determined in initial tests, after which the test solutions were prepared and filtered in randomized order according to the test matrix. The aim of the screening was to find out which parameters affected filterability and what their optimal combination would be to maximize the filtration rate and the filtration yield. Finally, the optimized parameters were used to test different batches of hypromellose, comparing the results to previous filtration tests. Additionally, an alternative hypromellose dispersion method was tested to minimize the amount of insoluble material remaining after the dispersion and cooling steps. Of the parameters tested, mixing speed was the least significant, while cooling temperature had the greatest effect on the filtration results. Solutions with a lower cooling temperature filtered better, possibly because increased hydration of the polymer chains reduces aggregation of hypromellose. The temperature behaviour of hypromellose solutions could be an interesting subject for further investigation. Longer mixing times and higher dispersion temperatures produced slightly better filtration results on average, but the differences were not statistically significant. The most challenging aspects of the study were controlling the temperature and mixing of the solutions, and the retention of insoluble hypromellose material on the walls of the compounding vessel. The alternative dispersion method gave promising preliminary results, but still requires further testing. It would also be important to find the root cause of the filter-clogging mechanism, e.g. by further analysing the clogged filter membrane. The study provided additional useful information on the behaviour of hypromellose solutions during solution preparation and sterile filtration, which has been helpful in solving production problems.
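The screening design described in this abstract (a two-level full factorial in four factors, no replicates, three centre points, randomized run order) can be sketched as follows. This is a minimal illustration only: the factor names are taken from the abstract, but the coded levels (-1/+1/0) stand in for actual minimum, maximum, and midpoint values, which the abstract does not report.

```python
import itertools
import random

# Coded levels: -1 = minimum, +1 = maximum (actual values were set in initial tests).
factors = {
    "mixing_time": (-1, +1),
    "mixing_speed": (-1, +1),
    "dispersion_temp": (-1, +1),
    "cooling_temp": (-1, +1),
}

# Two-level full factorial: every combination of low/high levels (2^4 = 16 runs).
runs = [dict(zip(factors, levels))
        for levels in itertools.product(*factors.values())]

# Three centre points: all factors at their midpoint (coded 0).
runs += [{name: 0 for name in factors} for _ in range(3)]

# Runs are executed in randomized order to guard against time trends.
random.shuffle(runs)

print(len(runs))  # 19 runs in total (16 factorial + 3 centre points)
```

With no replicates, the centre points provide the only estimate of pure error and a check for curvature between the low and high levels.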
  • Berg, Anton (2021)
    China represents a digital dictatorship where digital repressive measures are effectively used to control citizens and dissidents. Continuous monitoring and analysis of data on individuals, groups and organizations is the core of China's massive digital surveillance system. The internet and social media are under censorship, information is manipulated, and certain individuals and groups are targeted. In order to achieve all of this, China makes extensive use of modern technology such as artificial intelligence and facial recognition. One particular section of the population that has had to experience the full force and scale of digital repression is China's own indigenous people, the Uyghurs. Based on their research, human rights organizations such as Human Rights Watch have reported that the Chinese authorities have placed Xinjiang Uyghurs under mass arrests and detentions. According to the US State Department's recent estimates, possibly over a million Uyghurs, ethnic Kazakhs and other Muslims are being held in internment camps. The detainees are reportedly subjected to beatings, torture and rape, and some have even been killed. The plight of the Uyghurs represents the most extensive mass imprisonment of an ethnic and religious minority since World War II. China is also systematically seeking to destroy Uyghur culture and ethnic characteristics. Mosques are destroyed, the practice of religion and the use of the mother tongue are banned, children are separated from their parents and women are forcibly sterilized. In January 2021, the US Secretary of State, Mike Pompeo, issued a declaration naming the acts China is committing against the Uyghurs and other Muslim minorities as genocide. Many liberal democracies responded with similar statements. This master's thesis seeks to answer three main questions. First, what is digital repression; second, how does China use modern technology for digital repression; and third, how does this repression affect the Uyghurs? In addition, I consider the ethical dimension and the issues associated with digital repression. This includes the broader context of repressive algorithms, such as direct or indirect discrimination, as well as human rights issues, such as privacy and freedom. This is particularly important as we witness how the world is filled with a variety of devices that utilize artificial intelligence but also allow for a new scale of control and surveillance, and as we face the current era of digital dictatorships that do not respect human rights. The current world situation also raises a serious point of reflection, as artificial intelligence has become the subject of a new kind of competition between countries. China, for example, is exporting its surveillance technology and facial recognition capabilities beyond its own borders to countries like Pakistan and Zimbabwe.
  • Ylinen, Mari; Tyni, Ville; Pihkala, Jaana; Salminen, Jukka; Sairanen, Heikki; Sarkola, Taisto (2018)
    Objectives: The aim of the study was to assess the incidence and timing of aortic recoarctation and the risk factors associated with the procedure. Methods: The study group consisted of 304 patients diagnosed with aortic coarctation without significant associated cardiac defects. 251 patients were treated surgically, 40 with balloon angioplasty and 13 with stenting at the Children's Hospital (Lastenklinikka) in Helsinki in 2000-2012. Data on the patients' births, procedures and reinterventions were collected retrospectively from medical records up to 2014 (median follow-up 7.9 years). An age- and sex-matched comparison between the groups was performed for 86 patients (surgery n=43, balloon angioplasty or stent n=43). Results: Reintervention was required in 40/251 (16%) patients after surgery, 9/40 (23%) after balloon angioplasty and 4/13 (31%) after stenting. The median times to reintervention in each group were 3.4, 11.7 and 19.5 months (p<0.05). In the surgical group, all reinterventions were performed on children whose first operation had taken place before 12 months of age. The need for reintervention was associated with low weight and small aortic dimensions. In the balloon angioplasty group, the need for reintervention was associated with a high post-procedural systolic pressure difference between the arm and the leg. After stenting, three of the four reinterventions were planned post-stenting catheterizations. In the age- and sex-matched comparison (median 5.7 years, range 0.5-17.6), blood pressure gradients after balloon angioplasty or stenting were higher (mean 10 vs. 4 mmHg, p=0.03) and reinterventions more common (28%, 95% confidence interval 17-43 vs. 2%, 95% confidence interval 0-12) compared with the surgical patients. Conclusions: Reinterventions after neonatal surgery were fairly common. Among older children, those treated with balloon angioplasty or stenting had a higher risk of reintervention than surgical patients, mainly related to residual coarctation after the initial treatment.
  • Chandrasekar Rajendran, Suresh Chander (2014)
    Idli is a popular fermented cereal-legume food of Indian origin. It is steam-cooked from a fermented (lactic acid bacteria-yeast) batter of rice (cereal) and black gram (legume). Idli preparation includes three major steps: soaking of the rice and black gram, grinding, and fermentation. The preparation process is laborious, as the whole procedure takes about 20 hours. Further, the fermented batter has a shelf life of 4-5 days at 4 ºC. The literature shows that little effort has been made to improve the shelf life and nutritional quality of idli. The overall aim of this thesis was to improve the quality of idli batter by mild heat treatment (Objective 1) and through microbial applications (Objectives 2-4). First, the fermented idli batter was mild-heat (MH) treated (57, 60, 63, 66 and 70 ºC) to reduce the high (10.5 log cfu/g) lactic acid bacteria and yeast counts and thus enhance shelf stability in refrigerated storage. MH treatment at 70 ºC induced the highest reduction (3.6 log cfu/g) without affecting the pasting profile of the idli batter. During the storage study (up to 10 days at 4 ºC) the microbial counts decreased further without a change in pH. The second objective was to monitor changes in physicochemical properties and B-vitamin (riboflavin, folate and vitamin B12) levels during idli batter fermentation on addition of the starters Lactococcus lactis N8 (SAA1) and Saccharomyces boulardii (YEA1). Fermentation profiles were recorded for the starters individually and in combination. SAA1 and YEA1 were able to enhance or retain riboflavin and folate levels, but no change in vitamin B12 levels was observed during fermentation. Further, YEA1, individually and in combination with SAA1, significantly improved the idli batter volume, implying high gas production. The third objective was to produce nisin in idli batter by addition of SAA1 (a nisin producer). The results showed that SAA1 was capable of producing nisin in idli. However, the produced nisin was degraded by the activity of indigenous LAB and yeast in the idli batter. The final objective of this thesis was to determine the viability of probiotic Bacillus coagulans (BAC1) spores after cooking (steaming and microwaving) and during storage (at 4 ºC) of idli batter. Microwave cooking resulted in a higher reduction of BAC1 than steam cooking. However, 5.4 log cfu/g of BAC1 spores remained viable in steamed idli from the initially added amount (8.2 log cfu/g). The BAC1 spores were not stable in idli batter, suggesting spore outgrowth during storage. In summary, these results present different strategies and information for future process and product development in idli.
  • Liu, Xi Jr (2014)
    Jelly candy is produced by forming a gel with gelling agents in a starch mould, followed by drying for days in a chamber at about 45 °C to obtain the final products. The main ingredients of jelly candy are gelling agents, sweeteners, acidulants, colorings and flavorings. Gelling agents are the indispensable ingredients that give the gelatinous texture. Psyllium husk powder, konjac glucomannan and gellan gum are novel gelling agents for jelly candy, which can at the same time increase the dietary fiber content of the candies. It has been reported that psyllium husk powder and konjac glucomannan make jelly candy less sticky and springy. In this thesis, jelly candies were made with modified potato starch blended with psyllium husk powder, konjac glucomannan, or gellan gum. The first aim was to study the effect of different contents of gellan gum on jelly candy properties. The second aim was to compare the effects of psyllium husk powder, konjac glucomannan and gellan gum on jelly candy properties. To achieve these goals, the rheological properties of the jelly candy mass were studied. Mechanical properties, water content and water activity of the jelly candies were analyzed directly after preparation. The mechanical analysis was repeated after the jelly candies had been stored for two weeks. The raw data were analyzed in MATLAB and then modelled with PLS. The results showed that as the gellan gum content increased from 0% to 0.8%, the hardness, adhesiveness and elasticity of the jelly candies increased. At the same time, the yield stress, consistency coefficient and thixotropy of the candy mass also increased. Compared with gellan gum, psyllium husk powder and konjac glucomannan decreased the hardness, adhesiveness and elasticity of the jelly candies. They also decreased the yield stress, consistency coefficient and thixotropy of the candy mass. Furthermore, during storage all the jelly candies became harder and stickier. In conclusion, increasing the gellan gum content helped to make harder and chewier jelly candies. The stability of the gel structure also increased, and the candy mass flowed less easily. Compared with gellan gum, psyllium husk powder and konjac glucomannan contributed to softer and less chewy jelly candies. They decreased the stability of the gel structure and made the candy mass flow more easily.
  • Zhang, Yiran (2015)
    Toffee is a hard-textured confectionery product made by boiling together sugar, milk, and fat to a certain temperature. The aim of this study was to determine the suitability and effect of seven dairy powders in toffee processing. The recipes were based on a dry dairy powder content of 8.8% (1st series), and on a protein content of 3.5% with the solids content kept constant by compensating with sucrose (2nd series). The dairy powders used in this research were: skim milk powder (SMP); buttermilk powder (BMP); whole milk powder (WMP); lactose-free skim milk powder (LF-SMP); lactose-free whole milk powder (LF-WMP); 40% demineralized whey powder (D40); and 70% demineralized whey powder (D70). Color, mechanical properties, water content, water activity (aw), and glass transition temperature (Tg) of the final toffee samples were analyzed. Mechanical properties were also examined after one week of storage. The data obtained were analyzed with MATLAB and SPSS. In both series, toffees made with LF-SMP showed the darkest color, and those containing WMP the lightest. Higher water content and aw in the 2nd series resulted in softer and stickier toffees than in the 1st series. Increasing the protein and lactose content increased the hardness of toffee in the 1st series. In the 2nd series, sucrose crystals produced a gritty texture, decreasing the hardness and stiffness of the toffee but increasing its stickiness. After one week of storage, all the toffees became harder and stickier. Toffees made from lactose-free dairy powders consistently showed a higher tendency to flow and deform than the others, indicating the role of lactose in the stability of toffee. In conclusion, increasing the content of protein and lactose increased the hardness of toffee. Increasing the content of protein and reducing sugars led to a darker color of toffee. Higher water content and aw decreased the hardness, stiffness, and stability of toffee, but increased stickiness. Increasing the sucrose content likewise decreased the hardness, stiffness, and stability of toffee. SMP showed the highest suitability of the seven dairy powders for toffee manufacture. BMP and WMP could also serve as dairy sources in a toffee recipe. Lactose-free milk powder was not a good choice for forming stable toffee.
  • Sund, Marie (2016)
    Aim of study. The aim of this study was to determine how the processing of pitch cues in spoken words is affected by listeners' native language. In previous studies, listeners have shown greater sensitivity to acoustic features that are linguistically relevant in their native language. It has also been shown that the processing of pitch information is lateralized to the left hemisphere when the information is linguistically distinctive, and to the right hemisphere when it does not carry linguistically relevant information. The processing of lexical pitch has been shown to be language-specific. Pitch is lexically discriminating in Estonian, but not in Finnish. Therefore, native speakers of Estonian were hypothesized to show greater sensitivity to changes in pitch than native speakers of Finnish. They were also hypothesized to show leftward lateralization when processing linguistically discriminating changes in pitch. Methods. 12 native speakers of Estonian and 12 native speakers of Finnish participated in the study. Mismatch negativity (MMN) components of event-related potentials (ERP) were measured with electroencephalography (EEG). Stimuli consisted of Estonian words differing in duration and pitch. Results and conclusions. Scalp maps of neural activation suggested greater sensitivity to small changes in pitch in the Estonian group, as well as a tendency toward lateralization of pitch-cue processing to the left hemisphere in the Estonian group and to the right in the Finnish group. These observations were supported by a significant interaction effect between language group, lateralization, and stimulus type. However, further pairwise comparisons were only marginally significant. Due to large variation within the Estonian group, the group was split based on geographical background, since the use of the pitch cue has been shown to vary regionally in Estonia. This analysis indicated regional variation in the processing of the pitch cue; the western Estonian group showed lateralization to the left hemisphere while processing stimuli with a small change in pitch. The findings of this study are in line with previous studies showing that the native language affects the processing of pitch. They also suggest that the local language variety has an impact on these processes.
  • Leskelä, Mariia (2018)
    Plasmin is an enzyme that is an important factor affecting both the flavor and the shelf life of UHT milk. Plasmin hydrolyzes mainly β-, αs1- and αs2-caseins into γ-caseins and proteose peptones. During storage, proteolysis by plasmin causes bitterness and gelation. The objective of the present study was to investigate time-temperature combinations for the preheat treatment of UHT milk to decrease plasmin activity, thus improving the shelf life and quality of an existing product. In the experimental work, 14 different UHT milk samples were manufactured with 7 different preheat processes and two types of milk: prehydrolyzed (A) and posthydrolyzed (B). The extent of proteolysis in the samples was analyzed by two methods: SDS-PAGE, which gave qualitative information on casein degradation, and RP-HPLC, which gave quantitative information. Sensory analyses were also performed by eight trained employees at Arla Ltd (Sipoo, Finland). There were significant differences in the casein degradation of the samples at 2 and 4 months. All samples showed some proteolysis after 4 months of storage. The largest amount of casein degradation was found in samples 5, 7 and 11, whereas the least proteolysis was found in samples 1, 2, 13 and 14. The same results were obtained from both SDS-PAGE and RP-HPLC. The sensory evaluation showed differences between the samples regarding bitterness at 2 and 3 months. The samples evaluated as most bitter in the sensory analysis also showed the most proteolysis in the other analyses (samples 5, 7 and 11). The remaining samples were quite equal in all evaluated features. The impact of the lactose hydrolysis point remained unclear in this study. The aim of the study was accomplished, since optimal processing conditions were found (those of samples 1, 2, 13 and 14), and optimization of the process was possible based on these results.
  • Laamanen, Heimo (2021)
    Faculty: Faculty of Social Sciences. Degree programme: Master of Philosophy. Study track: Theoretical Philosophy. Author: Heimo Laamanen. Title: Process Reliabilism, Justification in the Context of Artificial Epistemic Agents. Level: Master. Month and year: 06.2021. Number of pages: 84 + 7. Keywords: Epistemology, justification, process reliabilism, artificial epistemic agent, philosophy of artificial intelligence. Supervisors: Markus Lammenranta and Jaakko Hirvelä. Where deposited: Helsinki University Library. Abstract: The main topic of this thesis is justification for belief in the context of AI-based intelligent software agents. This topic deals with issues belonging to the joint domain of the philosophy of artificial intelligence and epistemology. The objective of this thesis is to discuss a form of process reliabilism for the collaborative environment of human beings and intelligent software agents. The motivation for the study is the ongoing progress of artificial intelligence, robotics, and computer science in general. This progress has already enabled the establishment of environments in which human beings and intelligent software agents collaborate to provide their users with various information-based services. In the future, we will not be aware of whether a service is offered by human beings, by intelligent software agents, or jointly by them. Hence, there are two kinds of information agents, and this gives rise to the following key question: can an intelligent software agent also be an epistemic agent in a similar way to a human being? In other words, can an intelligent software agent have beliefs and justified beliefs, and, more importantly, can it know something? If so, then there is a clear motivation to extend epistemology to include the context of artificial epistemic agents. This, in turn, raises several new questions, such as the following: First, do artificial epistemic agents set any new requirements for epistemological concepts and theories concerning justification? And second, what would be the appropriate theory of justification in the context of artificial epistemic agents? At first, the reader is provided with the necessary background by discussing the following topics: introductions to epistemology and artificial intelligence; a collaborative environment of human beings and artificial epistemic agents; the concepts of information, proposition, belief, and truth; and scenarios with which the main ideas are clarified and tested. Then, the thesis introduces a form of applied epistemology, including its aim and some requirements that the development and operation of artificial epistemic agents set for theories of justification. Finally, after setting the scene, the thesis explores process reliabilism, including the main objections to it, and proposes an enhancement to process reliabilism so that it better addresses the context of artificial epistemic agents. The results are as follows: First, this thesis supports the view that an intelligent software agent can actually be an artificial epistemic agent capable of having beliefs, justified beliefs, and knowledge. Second, there is a clear motivation to extend the domain of epistemology to include artificial epistemic agents; this extension is a form of applied epistemology that has not yet been discussed much in either epistemology or artificial intelligence. Third, this thesis gives reasons for the supposition that the context of artificial epistemic agents sets new requirements for epistemological theories. And finally, this thesis gives motivation to support the idea that a form of process reliabilism called pragmatic process reliabilism could be the appropriate unified theory of justification for belief in the collaborative environment of human epistemic agents and artificial epistemic agents.
  • Vuorelma, Maria (2013)
    The objective of the study is to identify and analyse factors affecting the time from project submission to validation and project registration (the lead time) of projects in the Clean Development Mechanism (CDM) pipeline. The hypothesis was that since projects have been implemented in China for the longest time, and most projects have been located there, the process times would be shortest in China. The first two phases of the CDM cycle, i.e. validation and registration, are considered. These are the most crucial stages for a CDM project, since a project will not start generating Certified Emission Reductions (CERs) before it has been registered. The data used for the study are based on the CDM database collected by UNEP Risoe, which is compiled from information found on the UNFCCC website. The cut-off date for projects considered in the thesis is the end of April 2010. Four factors influencing the lead times that can be separated from the data are considered in the thesis: 1) the validator of the project, 2) the queue in the pipeline at the time the project is submitted, 3) the project type, and 4) the region in which the project is implemented. Two methods were used: an empirical study was conducted to assess the effects of the validator, project types, and regions on the lead times, and the effect of the queues in the process phases was analysed with regression analysis. The results show that the processing times are in fact generally much longer than they should be according to the rules and procedures of the CDM, which assign approximately 30 days to each phase. The study shows that queues have indeed had an effect on the lead times of the CDM. The results confirm the hypothesis: the longer the queue, the longer the lead time. Surprisingly, the countries where most CDM projects have been developed, mainly China, India and Brazil, do not have the shortest lead times. The lead times in China are the longest of all regions, even when only projects entering the pipeline after 2007 are considered. It seems that other factors weigh more heavily in China's favor for project implementation. It might be that since CDM projects have been implemented there for the longest time, project developers are familiar with the Chinese process and prefer it for that reason.
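The queue-effect regression described in this abstract can be sketched as a simple ordinary least squares fit of lead time against queue length at submission. The numbers below are illustrative placeholders, not data from the UNEP Risoe database, and the thesis may have used a richer model; this only shows the shape of the test "the longer the queue, the longer the lead time":

```python
import numpy as np

# Hypothetical observations: pipeline queue length at submission (projects)
# and observed lead time from submission to registration (days).
queue = np.array([10, 50, 120, 200, 350, 500], dtype=float)
lead_time = np.array([60, 95, 150, 210, 320, 430], dtype=float)

# Ordinary least squares: lead_time = b0 + b1 * queue
X = np.column_stack([np.ones_like(queue), queue])
b0, b1 = np.linalg.lstsq(X, lead_time, rcond=None)[0]

# A positive slope b1 supports the hypothesis that longer queues
# mean longer lead times.
print(round(b1, 3))
```

A fitted intercept well above the roughly 30 days the CDM rules assign per phase would mirror the thesis finding that processing times generally exceed the prescribed schedule.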
  • Wilhelmsson, Fanny (2021)
    As the business market is constantly changing and unforeseen events in society, such as the ongoing corona pandemic, arise, this has had a considerable impact on both the emergence and the handling of commercial disputes. There is a clear demand for cost-effective solutions, i.e. various kinds of alternative dispute resolution methods such as mediation. As a result, the use of various kinds of procedural agreements, such as mediation clauses in commercial contracts, has become increasingly common. Since procedural agreements are a new phenomenon, the question remains whether such agreements are binding at all. The fact that procedural agreements can be binding under civil law does not mean that such arrangements are binding under procedural law, and vice versa. There is no regulation of the procedural legal effect of mediation clauses in Finnish legislation. The purpose of the thesis is to examine the procedural legal effect of mediation clauses: can they be considered binding at all in relation to the principle of legal protection, the right to a court hearing under Article 6(1) of the ECHR, and the state's adjudicative function. Furthermore, criteria that may justify the binding effect of mediation clauses are discussed. Broadly speaking, it can be stated that attitudes toward the binding effect of procedural agreements such as mediation clauses vary but are, if anything, rather restrictive. The arguments against the procedural effect of mediation clauses mainly relate to the fact that the right to a court hearing is a fundamental right from which deviations must be expressly prescribed by law. Arguments in favor of the binding effect of procedural agreements, on the other hand, emphasize the adoption of various preconditions, such as that the wording of the agreement must be sufficiently precise, and the possibility that such arrangements may constitute a temporary procedural bar that merely has a suspensive effect on the right to a court hearing.
  • Korchinskaia, Anastasiia (2022)
    The thesis describes the theoretical premises for studying interaction (reading and creation) with texts of a new nature in the process of acquiring the Russian language. It examines the characteristics of Russian-language teaching in the Grand Duchy of Finland, presents a study of several Russian textbooks for Finnish schools from the period of Yakov Karlovich Grot's work as an inspector, held in the Slavica collection of the National Library of Finland, and reviews some contemporary Finnish textbooks for the presence of texts of a new nature. The phenomenon of texts of a new nature is described, along with the Russian experience of engaging with and creating such texts in Russian-language teaching within the "Literary Creativity" programme at the Sirius Educational Centre. The experience of working with the Livekuvitus technology at a workshop for international students in Finland was also studied and described. Interaction with texts of a new nature in Russian-language teaching is used both in Russia and in Finland, and as early as the 19th century the first illustrated textbooks appeared in Finland. Texts of a new nature are an interesting subject of study, since their use could be beneficial in teaching Russian; however, established pedagogical practice in the Russian educational sphere is still scarce. As examples, the thesis presents the products of the workshops on texts of a new nature at the Sirius Educational Centre in Sochi. A striking example of the use of texts of a new nature in contemporary pedagogical practice in Finland is the Livekuvitus technology. This thesis shows that its use could be effective in working with foreign students, since interaction with texts of a new nature while overcoming the language barrier, as exemplified by the workshop for international students at Karelia University of Applied Sciences, is an excellent way of engaging them in the learning process. Interaction with texts of a new nature could be effective in the study of Russian in Finland, where motivation can be considered rather low, since visualization gives the learning process a more interactive and accessible format.
  • Timonen, Anni (2016)
    Aim. We aimed to assess complications and functional outcomes of restorative proctocolectomy (RPC) with ileoanal anastomosis (IAA) performed on children with total colonic aganglionosis (TCA), in relation to patients with ulcerative colitis (UC). Methods. Medical records on complications, stool frequency, day- and nighttime continence, enterocolitis/pouchitis and fecal calprotectin levels of Hirschsprung disease (HD) patients who underwent RPC with IAA in a single center were compared to those of similarly treated patients with pediatric-onset UC. Results. The median age of the HD patients was 1.6 months at operation and 5.6 years at follow-up. Fourteen patients received a J-pouch and two a straight IAA. Stool frequency was 3.5 for daytime and 0 for nighttime. The total continence rate was 79%. At least two enterocolitis/pouchitis episodes occurred in 81%, while histologically verified pouch inflammation was observed in 27%. An increased fecal calprotectin value was observed in four patients. Conclusion. The outcomes were encouraging. Stool frequency and fecal continence appeared better preserved, and the frequency of histological pouch inflammation and fecal calprotectin levels were lower, than in the UC patients.
  • Isomäki, Noora (2021)
    Carbon markets form a fundamental part of the green economy, which is supposed to bring the world out of the climate crisis while maintaining economic growth and human well-being. This thesis contributes to critical research on the assumptions of the green economy and draws on the political-ecological literature. It explores voluntary carbon markets with qualitative methods, through a case study of a production chain of carbon credits starting from their production in the Cardamom Mountains, Cambodia and ending in their purchase and re-sale by the Compensate foundation in Finland. The focus of the analysis is the representations needed to create supply and demand for the carbon credit, and their effects. The thesis examines the complexity of the commodification of carbon credits. Carbon offsets rely on highly technical auditing schemes. To produce carbon credits with forestry projects, the project developers must describe a "baseline". The baseline describes a threat which the relevant area is facing and which the conservation organization can tackle. I analyse how the representations of the threat make the conservation area governable and justify intervention, and how, at the same time, they are unable to include the wider context, leaving important drivers of deforestation unaddressed and instead targeting the small-scale activities of individuals and local communities. Similarly, I show how, in order to sell such carbon credits, climate change must be represented as a problem that can be solved by the individual climate action of responsible consumers rather than as a systemic problem. As a result, both the production and sale of carbon credits focus strongly on targeting individuals while leaving broader societal structures unaddressed. This thesis highlights that the global North's ability and moral justification to continue high-carbon lifestyles through offsets requires people living in the global South to change their livelihoods and environments. 
Even if the communities in the conservation areas have some power to impact the ways the offset project operates, the level of optionality is much lower than in the global North, where the consumer is only subtly nudged to offset the distant damage they do. This approach is generally justified based on orientalist and neo-colonial discourses, according to which the people of the global South are unable to take care of their environments – and even themselves. The fact that no changes are demanded from the people of the global North and no existing power structures or practices are challenged arguably increases the desirability of the carbon markets as the major climate solution. This, however, also makes it justified to call carbon markets a non-transformative climate solution.
  • Soininvaara, Katri (2017)
    In condition-based maintenance, data are collected from a machine to provide advice on the frequency and location of developing faults. Statistical inference is needed to transform the data into information on the health of the machine. The ultimate goal is to minimise machine downtime due to unexpected breakage. Predictive maintenance attempts to forecast the condition of the machine components from the observed data and to maintain the machine just before it breaks down. The research question this thesis aims to answer is how to diagnose and predict component health based on data collected from the machine. Based on the literature, the hidden Markov model is selected for further study. There is usually uncertainty about the parameters and structure of the model due to the complicated causal relationships in the modelling problem. The thesis therefore concentrates on finding a suitable inference algorithm that is able to learn the model from data. Six different frequentist and Bayesian algorithms are tested on a synthetic example. A hypothesis is put forward that a hybrid genetic variational Bayesian algorithm could be used to find the best-performing hidden Markov model of component health. As expected, the hybrid variational algorithm performs better than the other examined algorithms, especially when there is uncertainty about the model structure. However, since there is typically an imbalance between the data depicting faults and the data depicting normal behaviour, the simulated test case shows that even the best-performing variational algorithm has difficulties identifying the correct model. This results in increased uncertainty in the health predictions. The thesis confirms that the hidden Markov model has many good qualities for modelling component health based on remote-monitoring data. Due to the versatility of the model, it can be modified to account for the many details of component degradation behaviour in different machines.
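    The diagnosis step described above, inferring a hidden health state from noisy sensor observations, can be sketched with the forward (filtering) recursion of a two-state hidden Markov model. This is an illustrative sketch only, not the model developed in the thesis; the state names and all probability values below are invented for the example.

    ```python
    # Two hidden health states; a component degrades irreversibly in this toy model.
    states = ["healthy", "degraded"]

    # P(next state | current state): degradation is slow and one-way (assumed values).
    trans = {"healthy": {"healthy": 0.95, "degraded": 0.05},
             "degraded": {"healthy": 0.0, "degraded": 1.0}}

    # P(observation | state): a degraded component triggers alarms more often (assumed).
    emit = {"healthy": {"normal": 0.9, "alarm": 0.1},
            "degraded": {"normal": 0.3, "alarm": 0.7}}

    # Initial belief over the hidden state (assumed).
    prior = {"healthy": 0.99, "degraded": 0.01}

    def filter_health(observations):
        """Forward recursion: return P(state | observations so far) after each step."""
        belief = dict(prior)
        history = []
        for obs in observations:
            # Predict: propagate the belief through the transition model.
            predicted = {s: sum(belief[p] * trans[p][s] for p in states)
                         for s in states}
            # Update: weight by the emission likelihood and renormalise.
            unnorm = {s: predicted[s] * emit[s][obs] for s in states}
            z = sum(unnorm.values())
            belief = {s: unnorm[s] / z for s in states}
            history.append(dict(belief))
        return history

    # A run of consecutive alarms pushes the belief towards "degraded".
    beliefs = filter_health(["normal", "alarm", "alarm", "alarm"])
    ```

    Learning the transition and emission probabilities themselves from data, rather than fixing them by hand as here, is exactly where the frequentist and Bayesian algorithms compared in the thesis come in.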
  • Alku, Aleksi (2012)
    This study investigates the process of producing interactivity in a converged media environment. The study asks whether more media convergence equals more interactivity. The research object is approached through semi-structured interviews with prominent decision-makers in the Finnish media. The main focus of the study is the three big traditional media (radio, television and the printed press) and their ability to adapt to the changing environment. The study develops theoretical models for the analysis of interactive features and convergence. Case studies are formed from the interview data and evaluated against the models. As a result, the cases are plotted and compared on a four-fold table. The cases are Radio Rock, NRJ, Big Brother, Television Chat, Olivia and Sanoma News. It is found that the theoretical models can accurately forecast the results of the case studies. The models are also able to distinguish different aspects of both interactivity and convergence, so that a case which at first glance seems not very interactive in the end receives the second-highest scores in the analysis. The highest scores are received by Big Brother and Sanoma News. Through the theory and the analysis of the research data, it is found that the concepts of interactivity and convergence are intimately intertwined and in many cases very hard to separate from each other. Hence the answer to the main question of this study is yes: convergence does promote interactivity and audience participation. The main theoretical background for the analysis of interactivity follows the work of Carrie Heeter, Spiro Kiousis and Sally McMillan. Heeter's six-dimensional definition of interactivity is used as the basis for operationalizing interactivity. Actor-network theory is used as the main theoretical framework for analyzing convergence. 
The definition and operationalization of the actor-network theory into a model of convergence follows the work of Michel Callon, Bruno Latour and especially John Law and Felix Stalder.
  • Zeinoddin, Narjes (2020)
    Endocytosis is the process responsible for internalising membrane components and as such plays a key role in the biology of the plasma membrane. Mammalian cells have evolved various endocytic strategies, but clathrin-mediated endocytosis (CME) is the most common type. Since the discovery of CME around 50 years ago, the field has built a remarkable wealth of knowledge on the core CME components. In stark contrast, our understanding of the relationship between CME and the actin cytoskeleton, which is present throughout the process, is still in its infancy. In this thesis, I describe the production and characterisation of recombinant, SpyCatcher-tagged transferrin (TF), a canonical CME ligand. TF was expressed in E. coli and, using an optimised protocol, successfully solubilised and refolded from inclusion bodies. The protein was then labelled with a fluorophore and purified to a high level of purity. Tests in mammalian cells showed that the home-made TF has the same endocytic behaviour as TF purified from human plasma. Moreover, I could show that the SpyCatcher moiety attached to our home-made TF is capable of mediating its covalent linkage to its counterpart SpyTag. The successful production, refolding and functional characterisation of recombinant TF in this study is an important first step towards examining the participation of the actin cytoskeleton during CME.
  • Aho, Kukka (2012)
    In multicellular organisms, complex signalling mechanisms have evolved to guide the behaviour of individual cells. Growth factors are secreted proteins that can stimulate the proliferation and/or differentiation of cells. Vascular endothelial growth factor D (VEGF-D) is a ligand for VEGF receptor 2 (VEGFR-2) and for VEGFR-3, which are predominantly expressed on blood vascular endothelial cells and on lymphatic endothelial cells, respectively. Thus VEGF-D can contribute to the growth of both blood vessels (angiogenesis) and lymphatic vessels (lymphangiogenesis). Although there have been many reports showing the angiogenic and lymphangiogenic effects of VEGF-D, its physiological role is still largely unknown. Most of these reports are severely hampered by incomplete characterization of the specific form of VEGF-D that was used. During or after secretion, VEGF-D undergoes complicated proteolytic processing. Alternative N-terminal cleavage results in two different fully processed forms, VEGF-D major and VEGF-D minor. Processing significantly increases the activity of VEGF-D towards its receptors. Surprisingly, it is still unknown whether the differential N-terminal cleavage of VEGF-D has any effect on receptor binding activity or on receptor activation. The goal of this study was to produce and purify high-quality, biologically active VEGF-D, which is needed for studying the physiological role of this growth factor. Several different forms of recombinant human VEGF-D were produced using the Drosophila Schneider 2 insect cell system. A bioassay utilizing Ba/F3 cells expressing chimeric VEGFR/EpoR receptors was used to determine the receptor binding activities of the recombinant VEGF-Ds. Two constructs producing biologically active VEGF-Ds were chosen for chromatographic purification (the untagged major and His-tagged major forms). During purification, the activity of both VEGF-D forms towards their receptors decreased significantly. 
In the case of the untagged form, this was presumably due to residual proteolytic activity during purification. The results might indicate that only the major form is responsible for the activation of VEGFR-3. The fact that no activity of the minor forms was detected when screening the cell supernatants with the Ba/F3-VEGFR-3-EpoR bioassay supports this explanation. If this explanation can be verified, the role of the alternative N-terminal cleavage becomes obvious: by proteolysis, the activity of VEGF-D can be redirected from the lymphatics towards the blood vessels.
  • Lampuoti, Jarkko (2021)
    Scandium-44 is a medically interesting positron- and gamma-emitting radionuclide with possible applications in molecular imaging. It is commonly produced with a cyclotron in a calcium-based or sometimes a titanium-based irradiation target. As the radiopharmaceutical use of scandium radionuclides commonly requires chelation, scandium needs to be separated from the target matrix. This is most often carried out either via extraction chromatography using a suitable solid phase or through precipitation-filtration. In this work, scandium-44, along with other scandium radionuclides, was produced by cyclotron irradiation with 10 MeV protons of a solid calcium carbonate or calcium metal target of natural isotopic abundance. Scandium was separated from the irradiated targets using four different chromatographic materials and a precipitation method. Scandium-44 was produced in kilo- and megabecquerel amounts with an average saturation yield of 47 MBq/μA. The separation yields achieved in a single elution ranged from 28 ± 11 % to 70 ± 20 %, with the best-performing extraction material being UTEVA resin.
  • Penttinen, Miro (2023)
    The camera recognises the face, the bank card connects to the payment terminal, and the database aggregates the consumer profile. Digital and cybernetic machines change society, but they also change the premises of production. For one piece of code to connect with another, the unclear must become clear and the indefinite definable. The trend, however, is not recent: bureaucrats, for instance, have long demanded that forms be filled out. Likewise, language has always required syntax. Such production demands a component, and the component increasingly determines the terms of overall production. I examine the social, affective, and ecological effects of these production premises (definability, reliability, predictability), and I assert that their unifying factor is a crisis of creativity. My essay examines the possibility of creativity in a society produced under componential logic. I address this issue by applying Franco 'Bifo' Berardi's dichotomy of connective and conjunctive concatenations. Connection refers to definable, repeatable, and predictable (i.e., componential) production. Conjunction, in turn, refers to the production of unrepeatable, ambiguous and open-ended qualities. I assert that the crisis of creativity unfolds when poetic openness is closed off, contradictions are resolved, and the undefinable is defined; in other words, when connection overtakes conjunction. In the increasingly connective society, general production becomes repeatable and predictable, and poetic flights and qualitative mutations become rare. Interestingly, qualitative mutations are a prerequisite for capitalism, as capitalism must constantly expand into new territories. It needs to establish new markets, as Rosa Luxemburg theorised, and to capture decoded desire, an argument known from Gilles Deleuze and Félix Guattari. 
Therefore, a paradox determines the social system: on the one hand, capitalism demands qualitative mutations for its expansion, but on the other, componentised production slows creative production down. I argue that by examining this contradiction, we can understand some of the most central pathologies of modern capitalism, such as burnout, depression and concentration disorders. Namely, modern capitalist culture has produced the spectacle to replace qualitative mutations with a large volume of quickly consumable, ephemeral production.