Forestcluster EffTech programme report


Intelligent and Resource-Efficient Production Technologies

Programme Report 2008–2010

www.forestcluster.fi


“Intelligent and Resource-Efficient Production Technologies” (EffTech) Programme

Programme Report 2008–2010

Antti Asikainen
Jari Hynynen
Teemu Teeri
Tapani Vuorinen
Marjo Määttänen
Risto Ritala
Heikki Kälviäinen
Lasse Lensu
Erkki Hellén
Juha Lipponen
Janne Poranen
Pauliina Tukiainen


Content

Foreword

Introduction

Less is more – intelligent and economical wood supply

New value chains of Finnish forest industry utilizing domestic wood

Functional genomics of wood formation – Towards knowledge-based breeding of wood

Virtual pulp bleaching (VIP)

Short pulping

New process design methodology to reduce capital employed and to improve flexibility (POJo)

Image-based measurement methods for quality in pulping and papermaking (QVision)

Re-engineering paper (REP)

Drainage and web formation

Future paper and board making technologies (TuPaKat)


Copyright Forestcluster Ltd 2011. All rights reserved. This publication includes materials protected under copyright law, the copyright for which is held by Forestcluster Ltd or a third party. The materials appearing in publications may not be used for commercial purposes. The contents of publications are the opinion of the writers and do not represent the official position of Forestcluster Ltd. Forestcluster Ltd bears no responsibility for any possible damages arising from their use. The original source must be mentioned when quoting from the materials.

ISBN 978-952-92-9269-1 (paperback)
ISBN 978-952-92-9270-7 (PDF)


Foreword


The forest industry has experienced one of the biggest structural market changes in its history in the past decade, and nowhere more so than in Finland. A number of the forest sector’s key paper products have reached a stage of permanent decline in their life cycle, and the paper sector is facing strong pressure to both come up with new products and applications and to improve the cost and capital efficiency of its existing products and assets.

These external pressures have broadened and opened up the industry’s research strategy, research processes, and resource structures. One of the key structural shifts has been the creation of Forestcluster Ltd – the Strategic Centre for Science, Technology and Innovation for the Finnish forest cluster. The preparation and implementation of research programmes within Forestcluster Ltd has focused on the entire value chain. The EffTech programme is a prime example of this approach, as it was drawn up jointly by players representing the whole value chain, from forest to printing house. This has had a refreshing and revitalising influence on the programme’s content and on the work of the research community.

EffTech was the first programme launched by Forestcluster Ltd. The goal of the EffTech programme was to produce knowledge for radically new solutions. This sets the bar high regarding the usability of research results, yet the rewards of success are all the richer for it. The programme received strong support from all Forestcluster shareholders, including funding. In addition, Tekes, the Finnish Funding Agency for Technology and Innovation, is a major funding party of the programme. Tekes has also been active in steering and inspiring the programme participants towards setting and attaining its high goals.

The fruits of these efforts will only materialize when the results are directly utilized by the industry or further developed in application projects. I therefore urge all parties to explore the obtained and reported results carefully and to leave no stone unturned in the search for innovative uses and applications.

Raino Kauppinen, Stora Enso Oyj

Chairperson of Programme Management Group


Introduction

Abstract

The Intelligent and Resource-Efficient Production Technologies (EffTech) Programme was launched by Forestcluster Ltd (the Strategic Centre for Science, Technology and Innovation for the Finnish forest industry cluster) in 2008. The goals of the programme are to improve the competitiveness of the Finnish forest cluster by developing radically new energy- and resource-efficient production technologies and by finding ways to reduce the capital intensiveness of the cluster. EffTech is divided across three work packages (WPs) based on the focal areas of the programme: Raw material availability, Modelling and measurements, and Processes and processing. The programme portfolio for the first two years included ten research projects with a total budget of EUR 11 million, as well as three consortium projects with a total budget of EUR 8 million.


1. Background

The Finnish forest cluster published its research strategy (‘Finland – the leading forest cluster by 2030’) in October 2006. The strategy outlined the future R&D priorities for promoting profitable development of the industry as a whole. In 2007, Forestcluster Ltd was founded with the main task of implementing the national research strategy. Forestcluster focused the cluster’s research activity in three key areas based on their strategic impact and on the potential for added value generation through cooperation between Forestcluster owner companies. The chosen areas were: Intelligent and resource-efficient production technologies, Future biorefinery, and Future customer solutions.

The EffTech programme was the first research programme launched by Forestcluster. The Forestcluster owners, together with representatives from research organizations, defined the research themes and key targets of the programme, as well as the evaluation criteria for project proposals, at a joint workshop. Based on the outcomes of the workshop, Forestcluster issued a call for research projects for the EffTech programme in spring 2008, and over 30 project proposals were submitted to Forestcluster as a result. The Forestcluster Research Committee prepared the structure, contents and targets of the programme, and ten projects were finally selected for inclusion in the EffTech programme portfolio.

2. Management of the programme

The first phase of the EffTech programme was led by a Management Group (MG) comprising representatives from both industry and academia. The EffTech projects were divided into three Work Packages (WPs), and WP Managers were appointed to coordinate work between the projects and WPs. The research tasks within the projects were performed under the leadership of Project Managers, and projects were reported to the MG by Work Package Managers. The execution of EffTech was coordinated by Programme Manager Pauliina Tukiainen of VTT.

The main tasks of the Management Group have been to supervise the progress of the programme with respect to the objectives of the national forest cluster research strategy and the EffTech programme plan, and to assess the scientific progress and techno-economic feasibility of the results. In 2009, the MG’s main tasks included the mid-term evaluation of the programme, organization of the EffTech Workshop, and discussions with the shareholder companies of Forestcluster Ltd in order to harmonize the EffTech programme with the companies’ research strategies and to define the most important focus areas for the coming three-year period. In 2010, the MG re-focused the EffTech programme and prepared programme proposals for the period 2010–2013.


The EffTech Management Group had the following members:
• Raino Kauppinen, Stora Enso, Chairman
• Lars Gädda, Forestcluster Ltd
• Jyrki Huovila, Metso Paper
• Erkki Hellén, VTT, WP3 Manager
• Jari Hynynen, Metla, WP1 Manager
• Mika Hyrylä, UPM-Kymmene (Timo Koskinen until June 2009)
• Jukka Kejonen, Myllykoski
• Juha Mettälä, Tamfelt
• Olavi Pikka, Andritz
• Ismo Reilama, Metsä-Botnia
• Risto Ritala, TUT, WP2 Manager
• Petri Silenius, Kemira
• Kenneth Sundberg, Ciba
• Pauliina Tukiainen, VTT, Programme Manager
• Mikko Ylhäisi, Tekes

3. Programme portfolio and goals

The goals of the first phase of the EffTech programme were to improve the competitiveness of the Finnish forest cluster as a whole by developing radically new energy- and resource-efficient production technologies and by finding means to reduce the capital intensiveness of the cluster. The objective was to reinforce the Finnish forest cluster’s leading position in the field of large-scale fibre-based paper and board production technology by developing more sustainable solutions. Productive domestic forest resources and competitive wood supply are crucial to the vitality of the Finnish forest cluster. Increasing the availability and supply of high quality raw material from Finnish forests in a sustainable and cost-efficient manner has therefore been one of the main strategic targets of the EffTech programme.

The EffTech programme portfolio for phase one included ten research projects divided into three work packages (WPs) based on the focal areas of the programme: Raw material availability, Modelling and measurements, and Processes and processing (Figure 1).

The objective of WP1 (Raw material availability) was to increase the availability and supply of high quality raw material from Finnish forests. WP1 consisted of three projects, each focusing on raw material availability but addressing the problem from different perspectives, applying different methods and addressing different time horizons. The Less is More project aimed at resolving wood supply issues through more efficient use of existing labour and machine resources. The focus of the New value chains project was on finding new end-product oriented, profitable and environmentally friendly forest industry value chains based on domestic wood supply. The objective of the Functional genomics of wood formation II project was to identify the key genetic determinants responsible for wood development in forest trees, with the aim of utilising them in forest tree breeding to control wood growth and wood quality.

The goal of WP2 (Modelling and measurements) was to achieve marked improvements in the use of modelling and simulation in order to increase the pace of development of new process concepts. WP2 consisted of four projects. Of these, Virtual pulp bleaching (VIP) and New process design methodology to reduce capital employed and to improve flexibility (POJo) concentrated on production system modelling. The POJo project was tasked with developing an industrially applicable first version of the multiobjective and bi-level design methodology and demonstrating its applicability by means of a case study. The VIP project was focused on modelling chemical pulp bleaching at the molecular level using phenomenon models. QVision focused on image-based measurement and characterization methods related to quality in pulping and papermaking, while the Short pulping project developed new methods for chemical pulp quality.

The goal of WP3 (Processes and processing) was to develop new resource-efficient production technologies which are profitable, support sustainability goals and enable a range of new products. The projects approached the paradigm of current papermaking from different directions. The Re-engineering paper project sought new resource-efficient production technologies for sheet production using cellulose nanofibres, and developed advanced modelling tools to speed up product and process development. The TuPaKat project took a broader view by drawing up projected scenarios and technology roadmaps for 2030 as well as proposals for new radical production technologies for forest-based businesses. In the SUORA project, a unique new convertible papermaking research environment utilizing the latest papermaking technology was developed for the needs of the Finnish forest cluster.

4. International cooperation

International co-operation is built into the EffTech programme and plays an important role in the development of novel resource-efficient production technologies. Research organizations are encouraged to pursue international collaboration for this purpose and with the aim of strengthening the position of Finnish research groups in international communities and opening up new co-operation opportunities. The programme has participated in cooperation with 7 countries in total (Canada, Germany, the UK, Israel, Sweden, Turkey and the USA). Close links with the international scientific community are maintained, in particular, in the areas of functional genomics of wood formation, forestry, wood procurement, chemical pulping, multi-parameter optimization, image analysis and nanocellulose research. The cooperation initiated during EffTech phase one will be continued in the second phase of the programme.

Figure 1. EffTech programme portfolio (Intelligent and Resource-Efficient Production Technologies programme portfolio). Work package 1, Raw material availability: Less is more; New value chains; Functional genomics of wood formation. Work package 2, Modelling and measurements: Virtual pulp bleaching; Short pulping; New process design concept for capital efficiency and flexibility; QVision. Work package 3, Processes and processing: Re-engineering paper; Future paper and board making technologies; Drainage and web formation.


The EffTech programme is designed so that overlapping research activities with related projects are minimized and the synergy with other research activities is maximized. Many of the researchers working within the programme also contribute to other related projects, which ensures active information exchange and rapid application of results. EffTech research groups have, for instance, participated in the European Community’s 7th Framework Programme projects and several COST actions.

The EffTech programme’s core research also supports several industry-driven projects aimed at developing industrial applications. While these projects are confidential, active participation of industrial partners within the programme ensures active information flow, which in turn speeds development.

5. Dissemination of results

The dissemination of the EffTech programme information is carried out using a number of different tools, the most important being the Forestcluster research portal, which is accessible to EffTech programme participants (http://www.forestclusterportal.fi/index.php/Project_Portal), and the Forestcluster Ltd website. Detailed project reports and publications are available through the Forestcluster portal. In addition, the programme’s research projects have held workshops, researcher training events and meetings for industry and researchers. EffTech seminars have also been held on the 18th December 2008, 6th October 2009 and 16th December 2010. The aim of the seminars was to bring together experts from academic and industrial fields and to provide a comprehensive overview of the EffTech programme’s current research activities and results. Approximately 100–150 attendees participated in each seminar. The EffTech Workshop was held on 7th October 2009. The workshop provided a current overview of the EffTech programme and generated new ideas and discussion regarding the key targets for the programme’s second phase. The outcome of the workshop provided the basis for building phase two of the programme.

6. Future plans

The second phase of the EffTech programme will continue as two separate yet strongly interlinked programmes: efficient networking towards novel products and processes (EffNet) and value through intensive and efficient fibre supply (EffFibre). The division into two separate programmes sharpens the programme focus and enables more flexible incorporation of new participants into the programme. The EffFibre and EffNet programmes together cover the whole value chain – from forest to print houses.

The programmes have a combined budget of EUR 26 million for 2010–2013, and involve a large number of forest cluster companies and leading research institutes. Tekes, the Finnish Funding Agency for Technology and Innovation, provides 60 percent of the programme budget.

The EffFibre programme focuses on improving the availability and supply of high-quality raw material from Finnish forests and developing novel production technologies for chemical pulping. The targets of the EffFibre programme are to increase the availability of wood biomass, to improve the efficiency of the wood supply value chain, and to enhance the utility value of Finnish wood. Research is aimed at generating concepts for improving the value creation potential of Finnish wood, a raw material with special quality characteristics. At the same time, the goal is to improve efficiency by increasing pulping yield and reducing the energy consumption and capital intensiveness of pulp production.

The goal of the EffNet programme is to improve the competitiveness of the entire forest cluster by developing radically new energy- and resource-efficient production technologies and by finding means to reduce the cluster’s capital intensiveness. Alongside new energy- and resource-efficient production technologies for web products, the EffNet programme focuses on designing nanocellulose-based production concepts and novel, innovative products. The successful research and promising results achieved to date by the EffTech programme have created good prospects, and continuation of the programme’s research is expected to lead to commercial nanocellulose breakthroughs in the near future. As a parallel goal, the EffNet programme aims to consolidate the Finnish forest cluster’s lead position in the field of large-scale fibre-based paper and board production technology through the development of more sustainable solutions.


Less is more – intelligent and economical wood supply

Project Manager: Antti Asikainen, [email protected]

Duration of the project: 1.6.2008–30.8.2010

Project budget: EUR 630,000

Project partners and role of participating organization:
• Finnish Forest Research Institute (Metla), Joensuu Unit: Discrete-event simulation of wood harvesting fleet, logistical setup of harvesting systems
• Metla, Vantaa Unit: Terramechanics, forest machine technology for soft soils, harvesting methods for young forests
• Metla, Suonenjoki Unit: Technology and logistics of silviculture and regeneration works in forestry
• VTT Technical Research Centre of Finland: Raw material quality, storage trials, processing (pulping) technology for stored material


Abstract

The Less is more project aimed at more efficient use of existing labour and machine resources in wood supply. Soil-sensing forest machinery and new models for optimizing machine resource allocation in forest operations on soft soils were developed. In addition, the impacts of prolonged wood storage on pulp quality and the energy consumption of pulping were studied. The results showed that Finland’s current harvesting fleet can be used more efficiently by equipping a proportion of machines for year-round harvesting on sensitive soils. It was also found that wood supply can be based on larger buffer stocks and longer storage periods than at present without endangering pulp quality or the cost efficiency of wood supply.

A soil torque based device and a soil layer scanner were developed and tested to measure and predict the bearing capacity of soils. The torque based measurement system and the sonar scanning system provided good estimates of overall trafficability and point load bearing capacity, respectively. Simulation studies showed that it is profitable for a forest machine entrepreneur to invest in equipment enabling year-round soft soil harvesting in harvesting districts in Finland where the share of peatland forest is considerable. Long-term storage of spruce pulpwood caused some deterioration of pulp colour in mechanical pulping, but this could be compensated for with appropriate bleaching treatments. In addition, the savings in transport costs were higher than the costs of additional chemicals or other treatments in the pulping process.

Soil trafficability assessment can be used for more effective timing of wood harvesting on soft soils. More efficient spot assessment of soil bearing capacity enables avoidance of deep rut formation and machine sinkage. The pulping trials with stored wood showed that long-term buffer storage can be used to even out wood supply and to lower transport costs. The quality assurance and processing of wood should be based on wood supply models that enable soil sensitivity estimation in order to ensure cost-efficient harvesting and transport and optimal end product value.

Keywords: wood harvesting, wood quality, sensitive soils, harvesting fleet management, TMP pulping


1. Project background

Strong seasonal and market-driven fluctuations in wood demand, as well as an increasing supply of wood from forests on sensitive soils, are challenging wood supply in Finland. To smooth supply, wood storage is needed at different stages of the wood procurement chain, as the harvestability of forest stands and the wood demand of mills vary independently.

In addition, reduced wood imports and shortening winters call for more efficient use of the existing domestic wood harvesting fleet and its manpower. The low bearing capacity of peat soils lowers the productivity of peatland forest harvesting and can cause significant ground damage. Forest machines can be equipped with wider tracks or extra wheels for improved trafficability, although these investments increase operational costs. Forest operations on soft soils can also be improved by more precise advance estimation of soil bearing capacity. This could be done by measuring soil properties and using, for example, weather data to estimate the moisture and strength of soil layers. In addition, machines could measure the depth of the peat layer as they move on site. The data collected could then be further processed for use in tutoring machine operators in efficient machine driving.

It is commonly understood that wood needs to be as fresh as possible when used for pulping. Freshness is ensured either through rapid wood procurement operations or by special wood storage arrangements. Maintaining wood procurement resources at a level that ensures seamless year-round delivery of fresh wood from stump to mill requires extensive machinery investments, thus increasing the unit costs of all delivered wood. Special wood storage arrangements such as sprinkling or cold storage also increase wood costs. Considerably lower wood procurement costs can, however, be achieved by reducing the amount of water transported to mills within the wood raw material, for example by allowing wood to dry prior to long-distance road transportation.
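As a rough, purely illustrative calculation (not part of the project’s cost analysis), the sketch below compares the mass of water hauled per tonne of dry fibre at the dry-matter contents reported later in this project for fresh and over-summer stored spruce pulpwood (43% and 72%).

```python
# Illustrative sketch: tonnes of water transported per tonne of dry wood
# substance at a given dry-matter content. The 43% (fresh) and 72% (stored)
# figures come from the project's own pulpwood data; everything else is
# just arithmetic, not the report's cost model.

def water_per_tonne_fibre(dry_matter_fraction: float) -> float:
    """Tonnes of water hauled per tonne of dry wood substance."""
    return (1.0 - dry_matter_fraction) / dry_matter_fraction

for label, dm in [("fresh (43% dry matter) ", 0.43),
                  ("stored (72% dry matter)", 0.72)]:
    print(f"{label}: {water_per_tonne_fibre(dm):.2f} t water per t dry fibre")

# fresh  : ~1.33 t water per t dry fibre
# stored : ~0.39 t water per t dry fibre, i.e. roughly 70% less water hauled
```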

2. Project objectives

The long-term goal is to find means of reducing fibre costs for mills through improved forest operations and through better understanding of the effects of wood storage and drying on wood properties and TMP processing. The key objectives are to:
• Develop methods for collecting data on the mechanical properties of soils, particularly peatland soils
• Validate methods of assessing the impact of harvesting machinery on soils
• Identify and quantify the most cost-effective options with regard to machine modifications and operating methods when the share of peatland logging is high, taking silvicultural operations also into consideration
• Investigate the impact of different productivity factors on the loading sequence of forwarder work, with a special focus on clarifying the influence of loading point positioning for an operator-assisting system
• Assess the differences between fresh and over-summer, land-stored spruce logs
• Investigate the plate gap phenomena of these different wood qualities during TMP refining
• Analyse the brightness and bleachability of the studied wood qualities
• Investigate possibilities for improving brightness and bleachability through chip or TMP pulp washing.


3. Research approach

The machine mobility study was based on an experimental approach. The test tracks were located on both mineral and peat soils and were passed over first by a 6-wheeled John Deere 1070D harvester and then by an 8-wheeled John Deere 1110D forwarder. The soil shear modulus was measured manually using a spiked shear vane developed by the Finnish Forest Research Institute (Metla).

A method for continuous ultrasound measurement of wheel sinkage was developed and tested. The system measures the distance between the soil surface and a specified position on the vehicle.
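A minimal sketch of how such distance readings can be turned into a sinkage estimate is given below; the mounting geometry, reference value and numbers are assumptions for illustration, not the project’s implementation.

```python
# Assumed sketch: derive wheel sinkage from continuous ultrasound readings of
# the distance between a fixed point on the chassis and the soil surface.
# Sinkage = reference distance recorded on firm ground minus the current
# reading; negative values correspond to vehicle pitch or surface
# irregularities, as noted in the results.

def sinkage_series(readings_m, reference_m):
    """Per-sample sinkage in metres from raw distance readings."""
    return [round(reference_m - r, 3) for r in readings_m]

readings = [0.80, 0.78, 0.72, 0.69, 0.83]   # hypothetical distance samples (m)
print(sinkage_series(readings, reference_m=0.80))
# -> [0.0, 0.02, 0.08, 0.11, -0.03]
```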

Based on the assumption that harvester motion resistance can predict forwarder motion resistance on the same site, a CAN bus based measurement of harvester motion resistance was developed for mobility mapping.

A numeric logging simulation model was compiled using WITNESS process simulation software. Genuine logging sites and logging contractors were selected for the simulations from extensive and accurate logging history data contributed by major forest companies. The simulation model consisted of three harvester-forwarder units and one low-bed truck for machine relocations. One of the three harvester-forwarder units was used for soft soil harvesting during summer. In order to increase the trafficability of the machines, the logging unit was modified with purpose-built band tracks for use on soft soils in non-frozen ground conditions. Four modification classes in terms of the maximum ground pressures of the machine concepts were created for the soft soil logging unit. Three contractors (A, B, C) were modelled.
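The WITNESS model itself is not reproduced here. The sketch below only illustrates, with entirely hypothetical sites, costs and productivities, the kind of trade-off such a fleet simulation evaluates: a soft-soil-modified unit loses some productivity but gains access to summer peatland sites and accumulates less down-time.

```python
# Much-simplified sketch of the trade-off the fleet simulation evaluates
# (the actual model was built in WITNESS; all sites, costs and productivities
# below are hypothetical, not project data).
from dataclasses import dataclass

@dataclass
class Site:
    volume_m3: float        # harvestable volume on the site
    peatland: bool          # soft-soil site
    frozen_season: bool     # accessible only when the ground is frozen

@dataclass
class LoggingUnit:
    productivity_m3_h: float
    hourly_cost_eur: float
    soft_soil_capable: bool = False   # e.g. purpose-built band tracks

def simulate_year(unit: LoggingUnit, sites: list[Site], work_hours: float):
    """Assign sites to one unit until its annual hours run out; return
    harvested volume (m3), unit cost (EUR/m3) and remaining idle hours."""
    hours_left, harvested, cost = work_hours, 0.0, 0.0
    for site in sites:
        if site.peatland and not site.frozen_season and not unit.soft_soil_capable:
            continue                      # summer peatland is out of reach
        hours_needed = site.volume_m3 / unit.productivity_m3_h
        if hours_needed > hours_left:
            break
        hours_left -= hours_needed
        harvested += site.volume_m3
        cost += hours_needed * unit.hourly_cost_eur
    # idle machine hours still carry fixed costs, which raises the unit cost
    cost += hours_left * unit.hourly_cost_eur * 0.5   # assumed fixed-cost share
    unit_cost = (cost / harvested) if harvested else float("inf")
    return harvested, unit_cost, hours_left

sites = [Site(1500, peatland=False, frozen_season=False),
         Site(900, peatland=True, frozen_season=False),
         Site(700, peatland=True, frozen_season=True)]
basic = LoggingUnit(productivity_m3_h=12.0, hourly_cost_eur=95.0)
modified = LoggingUnit(productivity_m3_h=10.7, hourly_cost_eur=105.0,
                       soft_soil_capable=True)
for name, u in [("basic (winter-only peatland)", basic),
                ("soft-soil modified", modified)]:
    vol, eur_m3, idle = simulate_year(u, sites, work_hours=300)
    print(f"{name}: {vol:.0f} m3 at {eur_m3:.1f} EUR/m3, idle {idle:.0f} h")
```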

A virtual forwarder simulator was chosen for work pattern and feedback assessment. Five experienced operators and five students participated in the trials. After 15 minutes of training, motor coordination and conception of the 3D environment were achieved in the boom handling tests. Detailed studies of forwarder work were carried out in three virtual logging environments including one final felling and two first thinning sites.

An experimental approach was used to quantify the differences between fresh and stored spruce logs, and their refining and bleaching behaviour in TMP production. Both raw material types were handled in the same way: logs were debarked, chipped and refined at a pilot scale. The material quality was tested at numbered stages (Figure 1).

The development of brightness from wood to pulp was closely investigated. The formation of coloured structures was monitored by UV-Vis reflectance measurements. The reflectance spectra were recorded by a Perkin Elmer Lambda 900 spectrometer equipped with an integrating sphere. The entire spectral region (200–800 nm) was measured from thick Bühner sheets (pH 5.0–5.2). The chromophore reactions responsible for the brightness changes were investigated by difference reflectance (ΔR) spectra.
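In its simplest form, a difference reflectance spectrum is the wavelength-by-wavelength difference between two measured reflectance spectra. The sketch below illustrates this with placeholder data; the actual spectra and any instrument-specific processing are not reproduced from the report.

```python
# Minimal sketch (assumed): a difference reflectance (dR) spectrum is the
# wavelength-by-wavelength difference between two reflectance spectra,
# e.g. "fresh" minus "stored" sheets measured over 200-800 nm.
import numpy as np

wavelengths = np.arange(200, 801, 5)                         # nm
r_fresh = np.random.uniform(0.4, 0.9, wavelengths.size)      # placeholder spectrum
r_stored = r_fresh - np.random.uniform(0.0, 0.1, wavelengths.size)

delta_r = r_fresh - r_stored                                  # dR spectrum

# Bands where dR is largest point to chromophores formed during storage,
# e.g. the ~460 and ~490 nm features discussed in the results.
peak_nm = wavelengths[np.argmax(delta_r)]
print(f"largest reflectance difference at ~{peak_nm} nm, dR = {delta_r.max():.3f}")
```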

The refining conditions were varied in order to achieve different heat treatments (Figure 2).

Chip washing and pressing were carried out prior to the laboratory-scale refining in order to assess the effects of chip washing on pulp brightness and bleaching response (Figure 3). A Frex piston press was used to carry out the testing programme using ion-exchanged water, DTPA and sulphuric acid solutions. A wing refiner located at VTT Jyväskylä was used for refining the treated chips. Similar washing treatments were performed on TMP pulp samples using a Perti tester at VTT Otaniemi to compare whether the washing treatment is more effective prior to or after refining.


4. Results

4.1 New methods for assessing trafficability on sensitive soils

The spiked shear vane proved capable of measuring peatland surface strength and predicting rut depth (Figure 4). Measurement of stony mineral soils was, however, problematic due to insufficient spike penetration depth.

The results of continuous ultrasonic measurement of wheel sinkage were difficult to interpret with stony mineral soils as the position of the vehicle chassis affected the measurement. Reasonable measurement results were, however, achieved on peatland sites. Figure 5 gives an example of harvester performance where high rear wheel sinkage can be noted. Occasional negative readings were caused either by vehicle pitch or surface irregularities. Rebound of the rut bottom after the vehicle pass was considerable: rut depth had very little correlation with wheel sinkage.

The technique opened up new possibilities for examining vehicle performance. The true sinkage of the vehicle wheels is presented in Figures 6 and 7. The correlation between harvester and forwarder average front and rear wheel true sinkage was 0.79, and the suitability of mobility mapping by harvester was thus found to be good. The studied 6-wheeled harvester was found to perform unfavourably in peatland operations, since the true sinkage of its rear wheels was 3.1 times that of the front wheels, as compared to the corresponding figure of 1.7 for the forwarder. Placement of the ultrasonic transducers was critical to avoiding damage and erroneous readings.

Figure 1. Quality testing of material in the refining process (stages: logs, debarking and chipping, chips, preheating, first-stage refining, second-stage refining; samples taken at numbered stages 1–4).

Figure 2. Experimental setup of refining (fresh or dry chips; no preheating or preheating; refining pressures 300 kPa and 500 kPa; refined to three SEC levels).

Figure 3. Processing steps in chip pressing and washing experiments with fresh and stored chips (no steaming or steaming; FREX pressing; treatment with deionized water, H2SO4 at pH 1.5, or 0.3% DTPA; washing with ionized water; wing refining of chips). Samples were taken at numbered stages: 1. condensed steam, 2. squeezed filtrate, 3. fresh impregnation liquor, 4. squeezed impregnation liquor, 5. squeezed washing water.


Figure 4. Rut depth vs. shear modulus for the studied mineral and peat soils.

Figure 5. Ultrasound sinkage and manual rut measurement results on peatland.


4.2 Operational efficiency in year-round CTL harvesting on sensitive soils and differences between operators in forwarder work

Unit costs varied from 9.5 to 11.9 €/m³ with contractor A, from 12.6 to 15.8 €/m³ with contractor B and from 15.2 to 18.7 €/m³ with contractor C, depending on the study scenario. The mean productivity of peatland loggings during the winter period was 12.0 m³/h, whereas the productivities of year-round peatland logging with the modification classes of “improved bearing”, “high bearing” and “extreme bearing” were 10.5, 10.7, and 10.9 m³/h, respectively. Despite the higher logging productivity during wintertime for peatland sites, the limited number of logging sites during the non-frozen period favoured the harvesting of some peatland sites during summertime, thus increasing logging opportunities, decreasing machine down-time and lowering unit costs for the whole year. In the contractor A scenario (peatland logging comprising 25% of the total removal), for instance, in which peatland forest was harvestable either only during winter or year round, the most economical approach was to purchase soft soil equipment and modify one logging unit for summertime peatland logging (Figure 8). Depending on the modification class for peatland loggings, the decrease in unit costs was 1.2 to 3.8% for year-round logging compared to peatland logging only during wintertime. When the share of peatland logging increased to 30% or more, there was a definite need to modify one logging unit for soft soil logging during summertime.

On general comparison, none of the modification classes clearly outperformed the others in monetary terms. Analysis of the study cases of all contractors indicated that when the share of lowest bearing capacity peatland sites is high and peatland loggings are carried out throughout the summer period, the most economical approach is to invest in achieving the “extreme bearing” modification class. Additionally, the smaller the ground pressure of the machine, the fewer rutting problems and work interruptions occurred, and the better the machine was able to drive itself free after sticking in peaty soil.

Figures 6 and 7. True wheel sinkage of the studied harvester and forwarder measured with the ultrasound technique.



One significant cost saving method is to schedule an optimal cutting order for peatland sites according to their ground bearing capacity class. Sites classed with the lowest ground bearing capacity should be harvested during winter when the ground is frozen and the highest bearing site classes should be summer harvested.

According to the simulation results, use of excavator-based forest machines for peatland harvesting during autumn and winter and for seedling planting during summer was a feasible and cost-efficient option. Planting costs decreased significantly if the base machine was used year round. This is an essential factor in improving cost-competitiveness in comparison to manual planting. One of the three logging units could be dedicated to planting seedlings and extracting logging residues and stumps during the summer.

In the virtual simulator study, extensive variation was found between operators in both the velocity and the length of trajectory of the boom tip. The most productive operator performed at the highest velocity and the shortest length of trajectory per average grapple load cycle. Conversely, the slowest operator had the second longest boom tip trajectory. Minimal difference was found in the perpendicular distance (x-component) to the pile, whereas the difference between operators in the roadwise distance (y-component) to the pile was larger.

Figure 8. Logging costs of contractor A per study scenario. With the modification class “basic”, peatland loggings were carried out only during winter, whereas the other modification classes enabled year-round peatland logging.

Share of peatlands, %: ~10, ~10, ~10, ~10, ~25, ~25, ~25, ~25, ~40, ~40
Length of winter, mth: 3.5, 3.5, 2, 2, 3.5, 3.5, 2, 2, 3.5, 3.5
Summer loggings, m³: 50,500; 41,300; 50,500; 41,300; 50,500; 41,300; 50,500; 41,300; 50,500; 41,300
Down time/chain, h: 107, 356, 415, 661, 8–111, 202–366, ~15, ~230, 0, ~50
Peatland loggings, m³: 9,326; 9,326; 9,369; 9,326; ~24,000; ~24,000; ~24,100; ~24,100; ~40,500; ~40,600
Total removal, m³: 116,180; 106,320; 92,900; 84,010; 109,000; 100,000; ~103,200; ~95,350; ~105,800–112,500; ~104,000–106,500



4.3 Quality control with procurement chains relying on buffer wood stocks

4.3.1 Differences between fresh and stored spruce logs and TMP pulps derived from them

Over-summer land-stored spruce pulpwood had clearly higher dry matter content (72% vs. 43%) and about 4 percentage units lower brightness than otherwise similar fresh spruce pulpwood. The UV-Vis spectra of wood pellets of the raw materials (Figure 9) reveal that the content of structures absorbing at 460 and 490 nm, and to some extent also >600 nm, is higher in dry wood. These structures tend to be formed during wood storage, whereas structures absorbing ~400 and 600 nm are formed mainly during refining.

Chromophores formed during wood storage or during pulp refining can be removed more efficiently from fresh wood than from dry wood during peroxide bleaching (Figure 10).

4.3.2 Plate gap phenomena

The dryness of wood does not influence thermo-mechanical pulping as long as the moisture content remains above the fibre saturation point. With the exception of optical properties, pulp properties between fresh and stored wood were relatively similar. The similar shape of the temperature profiles at the plate gap indicates similar pulp flow between fresh and stored raw materials. Stored wood caused slightly higher temperature levels (Figure 11), possibly due to higher pulp consistency during refining. The lower pulp freeness level obtained for dry chips (391 ml vs. 459 ml) may be related to the greater shear forces measured at the outer parts of the plate gap.

Figure 9. Difference reflectance spectra showing the difference between fresh and dry wood chromophores and the difference between the pulps produced from these irrespective of the refining conditions

Figure 10. Difference reflectance spectra showing coloured structures remaining to a higher extent in dry pulp after peroxide bleaching (1.5, 3 and 4.5% H2O2).


Figure 11. The temperature and shear force distributions at the plate gap in first-stage refining of fresh and dry chips with SEC 1.54 MWh/t.

Figure 12. Effect of chip washing treatments on pulp brightness. Comparison with pilot TMP pulps (TMP 13 fresh and TMP 26 dry) prepared from unwashed chips; note that the pulps prepared from washed chips were refined with a wing refiner. TMP-Q indicates the pulp brightness after chelation.


Figure 13. Effect of chip pretreatment on final brightness after peroxide bleaching (3% H2O2, 2.25% NaOH, 2% silicate, 15% consistency, 70°C, 120 min).

Figure 14. Effect of efficient pulp washing, aimed at metals removal, on final brightness after peroxide bleaching (3% H2O2, 2.25% NaOH, 2% silicate, 15% consistency, 70°C, 120 min).



4.3.3 Chip washing

The effect of chip pretreatments on pulp brightness after refining is shown in Figure 12. Normal pulp chelation without chip pretreatment results in similar brightness increase levels to chip pretreatments, especially in the case of fresh wood. After chip pretreatment, pulp chelation has no additional effect on pulp brightness. Unexpectedly, water treatment is as efficient as DTPA or acid washing, suggesting that the brightness increase is due to the washing/extraction of wood components other than metals.

The effect of chip pretreatments on pulp bleachability was evaluated by small-scale bleaching experiments (3% H2O2, 2.25% NaOH, 2% silicate, 15% consistency, 70°C, 120 min). Before bleaching, all of the pulps were chelated using 0.25% DTPA at 2% consistency, pH 6, and 70°C for 20 min. The final brightness of the bleached dry pulps remained lower despite the pretreatments (Figure 13). Chip pretreatment with water again seems to be as efficient as DTPA or acid pretreatment. However, after chip pretreatments the difference between the final brightness of fresh and dry pulps seems to be smaller.

4.3.4 Pulp washing

Pulp washing experiments gave similar results to the chip pretreatment experiments (results are not shown). Pulp chelation results in brightness increases in fresh and dry pulp similar to those of the pulp washing treatments tested. After washing, pulp chelation has no significant effect on brightness, as could be expected. Again, pure water treatment is as effective as DTPA or acid washing, suggesting that metals do not play a significant role in this brightness increase.

After pulp washing, the bleachability results are similar to those after chip pretreatments (Figure 14). The final brightness of the bleached dry pulp remains lower regardless of the pretreatments, and pulp washing with water seems to be as efficient as DTPA or acid washing. In this case, the brightness difference between bleached fresh and dry pulp remains similar after pulp washing. In this respect, chip washing can be more beneficial and may be related to lower formation of coloured structures during refining.

5. Future plans and key development needs

In silvicultural practices such as the tending and clearing of young stands, a productivity increase of 15% is attainable through full mechanization. In wood harvesting from intensively managed forests, a productivity increase of 15–20% has been estimated to result from improved material handling, use of semi-automation with operator tutoring, and a higher capacity utilization rate.

Finnish logging technology is internationally regarded as being of the highest level. A major bottleneck for larger market penetration by Nordic forest machines, however, has been the poor availability of operators sufficiently trained to operate the machines economically and efficiently. A technological leap has been achieved in the driving and manoeuvring of forest machines through the innovative fusion of data collected by forest machines, machine perception (i.e. machine vision and laser scanners) and ground sensors. These data are integrated and interpreted into meaningful feedback and guidance for the machine operator, enabling the operator to improve their performance to match the machine’s capabilities. In addition, tree mapping data measured during cutting using SLAM (Simultaneous (machine) Localization and (tree) Mapping) principles is used in the updating and quality monitoring of stand databases for future stand management purposes. A radical innovation is to apply the latest findings and solutions in machine- and AI-assisted feedback, advising and tutoring of pilots/operators of moving machines, such as military and aviation equipment, to the forest operations environment. Modern forest machines provide an excellent platform for studying and developing this concept. Research in this area, which began under the Less is more project, is being continued within Forestcluster Ltd’s EffFibre programme.

6. Exploitation plan and impact of results

The project results revealed strategic development needs as well as practical solutions for achieving cost-efficient wood supply. The studies concerning technological solutions for harvesting on soft soils are directly applicable when investments in harvesting machinery are made. Forest machine manufacturers see high potential in adding intelligence to machines to support driving and operating in ways that minimise sinkage and soil damage.

The comparative findings regarding the processing of stored and dried pulpwood could serve as a starting point for more efficient wood procurement operations. The proposed efficiency enhancement would be based on the delivery of fibre, instead of water, to the mill. The resulting change in one of the key operating parameters, i.e. the mass of wood per cubic metre, would have consequences for all wood procurement operations, from harvester operations (actions to promote debarking) to all transportation vehicles (change in size of cargo space, or total mass, or demand for trucks), and wood storage operations (new storing methods, wood terminals).


7. Publications and reports

Hallongren, H., Use of track-based excavators in year-round forest operations – A simulation study of mechanical forest planting and peatland forest harvesting. University of Eastern Finland, Faculty of Science and Forestry, Master’s thesis in Forest and wood technology, 2010. (In Finnish)

Lamminen, S., Väätäinen, K. & Asikainen, A., Operational efficiency of the year-round CTL-harvesting on sensitive sites in Finland – A simulation study. Precision Forestry Symposium, Stellenbosch, South Africa, 1.3.2010. (presentation)

Lamminen, S., Väätäinen, K. & Asikainen, A. 2010. Operational efficiency of the year-round CTL-harvesting on sensitive sites in Finland – A simulation study. In: Ackerman, P.A., Ham, H. & Lu, C. (eds.) Developments in Precision Forestry since 2006. Proceedings of the International Precision Forestry Symposium, Stellenbosch, South Africa, 1–3 March 2010. Stellenbosch University, p. 18–21. (extended abstract)

Sirviö, J., Särkilahti, A., Liitiä, L., Fredrikson, A., Salminen, L.I. & Nurminen, I., Prolonged wood storage causes mainly brightness problems for TMP. International Mechanical Pulping Conference, June 27–29, 2011, Xi’an, China. Accepted.

Väätäinen, K., Discrete event simulation – an advanced method for analyzing complex logging operations. In: Ackerman, P.A., Ham, H. & Lu, C. (eds.) Developments in Precision Forestry since 2006. Proceedings of the International Precision Forestry Symposium, Stellenbosch, South Africa, 1–3 March 2010. Stellenbosch University, p. 17. (extended abstract)

Väätäinen, K., Lamminen, S., Sirén, M., Ala-Ilomäki, J. and Asikainen, A., Ympärivuotisen puunkorjuun kustannusvaikutukset ojitetuilla turvemailla – korjuuyrittäjätason simulointitutkimus (Cost effects of year-round harvesting of drained peatland forests). Working Papers of the Finnish Forest Research Institute 184, 2010, 57 p. ISBN 978-951-40-2276-0 (PDF). (In Finnish)


New value chains of Finnish forest industry utilizing domestic wood

Project Manager: Jari Hynynen, [email protected]

Duration of the project: 1.6.2008–30.8.2010

Project budget: EUR 495,000

Project partners and role of participating organization:
• Finnish Forest Research Institute (Metla): Project coordination. Analysis of the wood supply chain (forest management, harvesting and logging, carbon sequestration): modelling, simulation and decision support systems, wood properties.
• VTT Technical Research Centre of Finland: Responsible for carbon footprint calculation. VTT and Metla co-developed a link between the MOTTI and KCL-ECO programs, thus providing a tool for carbon footprint analysis of the value chain.
• VTT (Jyväskylä): Responsible for dense media fractionation studies.


Abstract

The New value chains of Finnish forest industry utilizing domestic wood project aimed at finding research-based solutions for cost efficiently and sustainably increasing and improving the production and availability of high quality domestic wood. The project also aimed at improving the energy and resource efficiency as well as environmental friendliness of current and future forest industry value chains. Intensive forest management was found to be justifiable for cost-efficient and sustainable production of high quality raw material for the forest industry. Carbon footprint analysis (LCA approach) of the production value chain for super-calendered (SC) paper showed that the most important source of CO2 eq. was the production of the electrical power consumed by integrated SC paper mills. Adopting a forest management strategy which combines pulpwood, timber and energy wood production, and the use of energy wood as an energy source at the mill, can notably decrease the carbon footprint of the value chain. Wood properties were found to be affected by management practices, and have a notable effect on the resource and energy efficiency of pulping processes.

The project provides new research-based information on the potential of alternative forest management strategies to produce high quality wood and biomass for the forest industry. Furthermore, new information is produced and methods developed for assessing the energy and resource efficiency and environmental friendliness of the alternative value chains of the forest cluster. The results of the carbon footprint analysis can be applied in assessing the environmental friendliness of alternative wood supply and wood processing chains, and in assessing their effects on emission trading. The results are applicable in the decision making and planning of new practices in different parts of forest cluster value chains.

Keywords: forest management, growth and yield, wood properties, carbon sequestration, carbon footprint, LCA analysis, pulp and paper making processes, SC paper, dense media fractionation, value chain


1. Project background

The aim of the Finnish forest cluster is to double the value of its forest-based products and services, and to increase the use of domestic wood by 25%. The whole value chain from forest to end products should operate in a sustainable, environmentally friendly and responsible manner. To meet this goal, the entire forest cluster value chain needs to be developed. In order to do this, all practices within the value chain must be thoroughly analyzed and evaluated, and the most critical practices identified.

Increasing the value of forest-based products requires new methods in wood production in order to ensure the availability of high quality raw material, and new methods in wood processing in order to ensure high quality end products. Furthermore, the energy, resource and cost efficiency of all parts of the value chain need to be improved. Life cycle and carbon footprint analyses have proved to be applicable methods for assessing the degree of sustainability and environmental friendliness of activities and processes within the value chain.

2. Project objectives

The project addressed the energy and resource efficiency of the forest industry’s current and future domestic raw material based value chains. The current value chain covers forest management for wood production, logging and transportation of wood from the forest to the mill/plant, and the processing of wood into end products.

The main objectives of the project are to find research-based solutions for:
• Improving the availability of domestic wood
• Increasing domestic wood production
• Improving the quality of domestic wood as raw material for the forest industry
• Improving the energy and resource efficiency of wood processing in a cost-efficient and sustainable manner.

Carbon footprint analysis was used to calculate the energy and resource efficiency of value chains. Regarding wood supply, the effects of alternative wood production chains on carbon sequestration were assessed, as well as the carbon footprint of practices applied in forest management, logging and the transportation of wood. In wood processing, the energy efficiency of alternative production processes was emphasized.

3. Research approach

The research approach was based on holistic analysis of the entire value chain. Critical parts and activities within the value chain requiring improvement in order to meet the project objectives were identified and emphasized.

Value chains were assessed using model-based scenario analyses. The case study approach was regarded as a viable method of performing the value chain analysis. Wood supply scenarios were created for a set of typical Finnish forest types which, on the one hand, represent the most important forest areas for wood production in Finland and, on the other hand, are challenging in terms of domestic wood supply. The forest types included in the analyses were:
1. Norway spruce stands on highly productive mineral soil sites in southern Finland
2. Scots pine stands on mineral soil sites in central Finland
3. Scots pine stands on drained peatland sites in central Finland.


The objective of the analysis was to assess the impact of management on wood supply. We therefore applied a simplifying assumption regarding the forest structure. We assumed that for each forest type, the forest area constitutes a uniform “normal forest” in which stands of all age classes from regeneration to end of rotation are represented in equal proportion. For this type of forest area, the annual wood and biomass production, carbon sequestration level, and management operation volumes are constant.
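Under this assumption, the per-hectare annual figures are simply rotation averages over one stand’s management schedule; the short sketch below illustrates the bookkeeping with hypothetical removal volumes (not MOTTI output).

```python
# Minimal sketch of the "normal forest" bookkeeping (illustrative numbers
# only): with every age class present in equal area, the annual harvest per
# hectare equals the rotation-average of all removals in one stand's
# management schedule.

rotation_years = 80
# Hypothetical schedule for one stand: (stand age in years, removal in m3/ha)
removals = [(35, 60.0),    # first thinning
            (55, 80.0),    # second thinning
            (80, 250.0)]   # final felling

mean_annual_harvest = sum(vol for _, vol in removals) / rotation_years
print(f"{mean_annual_harvest:.1f} m3/ha/year")   # -> 4.9 m3/ha/year
```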

For each case study, four alternative management scenarios were created:
1. Management according to current management recommendations for commercial forests (‘business as usual’): combined pulp and timber production
2. Extensive management: no management practices in addition to obligatory forest regeneration operations
3. Pulpwood production chain applying short rotations without commercial thinnings
4. Management adapted to climate change: combined production of pulp, timber and biomass aiming at high levels of carbon sequestration in the growing stock and soil, and production of renewable raw material and energy to replace fossil fuels.

The wood supply scenarios were analyzed with respect to the quantity and quality of raw material produced, the cost and energy efficiency of wood supply, and their carbon sequestration potential. The MOTTI stand simulation software of the Finnish Forest Research Institute (Metla) was applied in this analysis, augmented with the Yasso model for prediction of soil carbon dynamics.

The energy and resource efficiency of the whole value chain from forest to end product (cradle-to-gate) was analyzed for the SC paper production chain (Figure 1). We analyzed the carbon footprint of the value chain, describing the greenhouse gases emitted throughout the life cycle of the product or system. This was accomplished by applying LCA calculations, taking greenhouse gas emissions into account.

For calculating the carbon footprint of the value chain, we integrated models describing:
• Development of forest stand dynamics and the effects of forest management on stand development and wood quality
• Time and energy consumption of wood procurement (logging, storage and transportation)
• Practices and processes in the wood processing industry

In the analysis, two advanced simulation tools were combined and applied: the MOTTI simulator for wood supply analysis, and KCL-ECO for industrial processes and LCA calculations.
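The core of such a cradle-to-gate footprint is a sum, over the unit processes of the value chain, of greenhouse gas emissions converted to CO2 equivalents. The sketch below illustrates the aggregation with commonly used 100-year GWP factors and entirely hypothetical per-process figures; it is not the KCL-ECO implementation or the project’s data.

```python
# Assumed sketch of cradle-to-gate carbon footprint aggregation: sum emissions
# from each unit process, converted to CO2 equivalents with 100-year GWP
# factors (commonly used IPCC AR4 values; per-process figures are hypothetical).

GWP = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

# Emissions in kg per tonne of SC paper, by life cycle stage (hypothetical)
unit_processes = {
    "forestry operations":   {"CO2": 18.0, "CH4": 0.01},
    "wood transport":        {"CO2": 35.0},
    "integrated SC mill":    {"CO2": 120.0, "CH4": 0.05, "N2O": 0.02},
    "purchased electricity": {"CO2": 310.0},
}

def footprint_kg_co2e(processes: dict) -> float:
    """Total footprint as the GWP-weighted sum over all processes and gases."""
    return sum(GWP[gas] * amount
               for emissions in processes.values()
               for gas, amount in emissions.items())

total = footprint_kg_co2e(unit_processes)
print(f"cradle-to-gate footprint: {total:.0f} kg CO2e per tonne of paper")
```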

The results of this project are applicable in the development of greenhouse gas inventory methods. The most favourable production chains were identified with respect to their carbon footprint.

The potential for influencing wood properties through forest management methods was investigated by collecting and analyzing wood samples from long-term experimental stands with varying treatment intensity (thinning and fertilization trials of Metla). This data was utilized in statistical analysis and modelling of wood properties and their response to alternative management methods. In addition, the same sample tree data were utilized in fractionation (VTT, Jyväskylä) and pulping studies (VTT, Espoo). The aim of the fractionation studies was to determine the potential of dense media fractionation of wood prior to pulping in narrowing the variation of raw material properties and in reducing the energy consumption and costs of fibre production.

Figure 1. Alternative wood supply chains from spruce stands in southern Finland (adapted from Hynynen, 2009).

Pulping experiments were carried out to show the effect of the wood properties of spruce and pine, as influenced by growth rate, on process efficiency and pulp quality.

4. Results

4.1 Forest industry value chains

Mean annual wood and biomass production was notably affected by management strategy (Figure 2). Intensive management scenarios aiming at combined production of pulpwood and timber resulted in the highest harvestable yields. On the other hand, extensive management led to poor saw timber production and increased natural mortality. In climate change adjusted management, recovered energy wood comprised a notable share of total yield, resulting in the highest total yields for mineral soil sites.

The net annual carbon sequestration, average carbon storage in the growing stock and soil, and emissions from management practices were calculated for spruce stands, the main results of which are presented in Table 1.

Harvestable carbon content is proportional to harvestable yield. The average carbon storage of forest is, as expected, highest in unmanaged stands with high stocking densities. Despite their high carbon storage, their net sequestration capacity is proportional to their net biomass yield. Ultimately, the impact of alternative wood supply scenarios on the atmospheric CO2 balance depends on how the harvested wood and biomass yield is utilized.

Figure 2. Mean annual wood and biomass yields of different forest types treated according to alternative management scenarios (preliminary results).

The proportion of emissions related to management practices, including silvicultural operations, logging and secondary haulage, is marginal compared to the carbon content of harvesting removals. Although biomass recovery involves fairly intensive harvesting operations, emissions account for only a few percent of the CO2 content of the recovered biomass, which can be used to substitute fossil fuels in energy production.
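The order of magnitude of this comparison can be checked with a back-of-the-envelope calculation: dry wood is roughly half carbon by mass, and each kilogram of carbon corresponds to 44/12 kg of CO2. The recovery amounts and emission figures in the sketch below are assumed values chosen only to illustrate the arithmetic, not results of the project.

# Rough check of the claim that harvesting emissions are only a few percent
# of the CO2 content of the recovered biomass. All inputs are illustrative.

dry_biomass_kg_per_ha = 3000.0      # assumed annually recovered energy wood, kg dry matter/ha
carbon_fraction = 0.5               # dry wood is roughly 50 % carbon by mass
harvest_emissions_kg_co2e = 140.0   # assumed emissions from recovery operations, kg CO2e/ha

co2_content = dry_biomass_kg_per_ha * carbon_fraction * (44.0 / 12.0)  # C -> CO2
share = harvest_emissions_kg_co2e / co2_content

print(f"CO2 bound in recovered biomass: {co2_content:.0f} kg CO2/ha")
print(f"Harvesting emissions as a share of that: {100 * share:.1f} %")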

The carbon footprint calculations were made for typical Finnish SC paper, which was assumed to be produced in an integrated mill. The unit processes of the life cycle are presented in Figure 3, where the different colours represent the different life cycle stages. The unit processes other than forestry and the integrated mill were assumed to be the same in all scenarios.

Data on the different forest management scenarios and operations performed was produced using the Motti software developed by Metla, including data on emissions and fuel use. The received data included emissions of CO2, CO, HC, NOX, PM and SO2. In addition, it contained data on the consumption of petrol, diesel and other oils, the latter of which were assumed to be lubricating oils. Emissions from the extraction and processing of these fuels were taken into account using data received from public databases. Land use was also taken into account.

                                                Traditional    Short rotation   Management adjusted   Extensive
                                                management     management       to climate change     management

Carbon content of annually harvested
commercial stemwood (kg CO2e year-1 ha-1)        6 880          5 560            6 620                 4 700

Carbon content of annually harvested
energy wood (kg CO2e year-1 ha-1)                -              -                3 630                 -

Average carbon storage in growing stock
(kg CO2e ha-1)                                   189 570        136 770          210 680               487 300

Average carbon storage in soil
(kg CO2e ha-1)                                   221 280        225 870          158 770               325 230

Mean annual emissions of forest operations
(kg CO2e year-1 ha-1)                            115            115              138                   63

Table 1. Carbon statistics for the wood supply chain in Norway spruce stands, expressed in CO2 equivalent units (preliminary results).


Figure 3. System boundaries and unit processes in the carbon footprint calculations for SC paper.

The amount of wood harvested in the different scenarios was given in kg dry matter (including bark) per hectare. The amount of wood harvested and the emissions from each harvesting operation varied between the scenarios.

Other data used in the calculations was derived from the KCL EcoData and Ecoinvent databases. The consumption of energy, wood and chemicals by the integrated SC mill varied depending on the TMP/kraft ratio, as described below (Section 4.2.1, Wood properties and carbon footprint of the SC paper value chain, and Table 2).

The functional unit of the calculations was chosen to be 1,000 kg SC paper. The results of the different scenarios are presented in Figure 4. The main differences are caused by changes in energy consumption by the integrated paper mill. Although forestry operations account for only a small proportion of overall GHG emissions, the impact of forest management actions is visible in the end results.

The main conclusions of the carbon footprint calculations were as follows:
• The most important source of CO2 eq. was the production of the electricity used by the integrated SC mill
• The TMP/kraft ratio affects the results: the bigger the share of TMP, the bigger the footprint
• The biggest decrease in the carbon footprint of SC paper came from the use of energy wood for heat production in the “FAST, climate, wood as fuel” case
• Direct emissions from forest management operations had only a minor impact on the results
• The transport and manufacture of chemicals and fillers had only a minor impact on the results.


Forestry management                  Extensive    Tapio recomm.   Climate change adjusted   Fibre wood production
Growth rate of wood                  Slow         Normal          Fast                      Very fast
SEC TMP, MWh/t TMP                   2.9          3.2             3.5                       3.8
Heat from TMP process, GJ/t TMP      3.7          4.1             4.5                       4.9
TMP, % / t paper                     55.5         48              40.5                      33
Spruce, kg/t paper                   509          440             371                       303
Pine kraft, % / t paper              12.5         20              27.5                      35
Pine kraft, kg/t paper               137.5        220             303                       385

Table 2. Estimated changes in pulp and paper making processes due to changes in wood raw material properties.

4.2 Effect of wood properties on pulp and paper making processes

4.2.1 Wood properties and carbon footprint of the SC paper value chain

The effect of different wood production scenarios on the carbon footprint of the SC paper value chain was evaluated. With respect to mechanical pulping and papermaking, wood and fibre properties such as fibre dimensions or wood density are significant and were affected by the different growth rates in the different forest management scenarios.

If the fibre properties change, the paper properties also change. Since the quality of the end product needed to be kept constant, certain modifications to the furnish were required (Table 2).

For the calculations, the following assumptions were made:
• The type of forest management affects the stem growth rate (slow to very fast)
• TMP refining of fast-grown wood to a desired freeness requires more energy per pulp tonne than refining slow-grown wood
• The kraft content of the furnish has to be increased due to the lower strength properties of TMP made from fast-grown wood raw material
• The percentage increase in steam heat recovery (heat energy production) is assumed to correspond to the increase in energy consumption of refining
• For the evaluated wood production scenarios, the estimated growth rates, energy consumptions, required increase in kraft content, and the change in heat energy production are presented in Table 2.
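As a simple illustration of how the Table 2 assumptions combine per tonne of paper, the sketch below multiplies the TMP share of the furnish by the specific energy consumption of refining for each growth-rate scenario; all other mill energy loads are ignored, so the derived figures are not carbon footprint results, only a reading of the table.

# Sketch of combining the Table 2 assumptions per tonne of SC paper.
# Shares and specific energy consumptions are taken from Table 2.

scenarios = {
    # growth rate: (SEC, MWh/t TMP; TMP share of furnish; pine kraft share)
    "slow (extensive)":             (2.9, 0.555, 0.125),
    "normal (Tapio recomm.)":       (3.2, 0.480, 0.200),
    "fast (climate adjusted)":      (3.5, 0.405, 0.275),
    "very fast (fibre wood prod.)": (3.8, 0.330, 0.350),
}

for name, (sec_tmp, tmp_share, kraft_share) in scenarios.items():
    refining_energy = sec_tmp * tmp_share          # MWh per tonne of paper, refining only
    print(f"{name:30s} TMP {tmp_share * 100:4.1f} %  kraft {kraft_share * 100:4.1f} %"
          f"  refining {refining_energy:4.2f} MWh/t paper")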


Figure 4. Carbon footprint results for SC paper with different forestry scenarios.

4.2.2 Fractionation prior to pulping

Pine logs representing three different growth rates were delivered for fractionation and pulping studies from Metla’s experimental plot. The logs were debarked and chipped at VTT and their average basic densities were determined. The maximum density and coarseness differences between the samples were 33 kg m-3 and 15 µg m-1, respectively. Some sample logs were also analyzed at STFI Packforsk using the Silviscan method.

The chip size was reduced to less than 2 mm thickness in order to separate the earlywood and latewood using a method developed by VTT. The resulting pin chips were then steamed and impregnated with water to remove air and to fill the fibre lumens and pores with water. Chip fractionation was performed using the dense media fractionation method. The density of the fractionating medium was adjusted according to the desired basic density of the fractions. Concentrated sodium sulphate solution was used as the fractionating medium. Continuously operating prototype fractionation equipment was used for the fractionation trials.

According to the results, dense media fractionation can be used to increase the basic density and coarseness differences between pin chip fractions. The maximum basic density and coarseness differences were 150 kg m-3 and 50 µg m-1, respectively. The mechanical pulping trials showed density to be a critical raw material parameter with respect to energy consumption and pulp quality. The energy consumption difference between the fractions was 47%. The fractions also showed clear differences in paper sample surface roughness and density values.

Standard cooking trials were also carried out, but only for medium growth rate fractions. Yield and brightness differences between these fractions were measured, but further studies are required to verify them conclusively. Use of optimized cooking conditions should reveal larger fibre level differences between fractions.

4.2.3 Pulping trials

The aim of the pulping experiments was to show the effect of the wood properties of spruce and pine, as affected by growth rate, on process efficiency and pulp quality.

Compared to the corresponding slow-grown raw material, the spruce and pine samples with higher growth rate had lower fibre length, fibre wall thickness and density, and higher fibre width. Pine samples had lower fibre length and fibre width, but higher fibre wall thickness and density than the corresponding spruce samples.

In the kraft pulping experiments, a shorter cooking time (lower H-factor) was needed for cooking fast-grown pine to a given kappa than for slow-grown pine. The alkali consumption was lower and the yield was one unit higher. However, the situation was the reverse in the case of spruce. The pulps made from fast-grown raw materials had lower fibre length than those made from slowly grown raw materials, and also had a lower tear index at a given refining energy level. Bulk was also lower, while air resistance, internal bonding strength and light-scattering were higher. Tensile index was higher in the case of fast-grown spruce, although no effect was seen with pine.
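For readers unfamiliar with the H-factor mentioned above, it collapses the cooking time–temperature profile into a single number by integrating a relative delignification rate that equals one at 100 °C. The sketch below uses the commonly cited Vroom constants and an invented temperature profile purely as an illustration of that calculation.

# Minimal sketch of the H-factor concept: integrate a relative delignification
# rate (unity at 100 °C) over the cook. Constants follow the commonly cited
# Vroom formulation; the temperature profile below is invented.
import math

def relative_rate(temp_c):
    """Relative delignification rate at a given temperature (Vroom H-factor)."""
    return math.exp(43.2 - 16115.0 / (temp_c + 273.15))

def h_factor(profile, dt_h=0.05):
    """Integrate the relative rate over a piecewise-linear T(t) profile.

    profile: list of (time_h, temp_c) points, e.g. a heating ramp followed by a hold.
    """
    total, t = 0.0, profile[0][0]
    while t < profile[-1][0]:
        # linear interpolation of temperature at time t
        for (t0, c0), (t1, c1) in zip(profile, profile[1:]):
            if t0 <= t <= t1:
                temp = c0 + (c1 - c0) * (t - t0) / (t1 - t0) if t1 > t0 else c0
                break
        total += relative_rate(temp) * dt_h
        t += dt_h
    return total

# Example: 1.5 h ramp from 80 to 165 °C, then 2 h at 165 °C
print(f"H-factor ≈ {h_factor([(0.0, 80.0), (1.5, 165.0), (3.5, 165.0)]):.0f}")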

Compared to spruce, pine needed a longer cooking time to reach the specified kappa, and had higher chemical consumption and lower yield (Figure 5). Pine pulps had lower average fibre length, lower bulk, better internal bonding strength and higher light scattering. Slow-grown spruce had the lowest tensile strength and fast-grown pine had the lowest tear strength.

In the TMP refining experiments, fast-grown wood raw materials required more energy for refining to a given drainability than slow-grown materials (Figure 5). Fibre length was lower, but in the case of pine, the difference diminished during refining. The growth rate did not affect the tensile index but, in the case of pine, tear was lower for fast-grown wood. Light scattering was higher for fast-grown wood samples.

Pine pulps had lower fibre length and lower tensile and tear indices, but higher bulk and better light scattering than spruce pulps.

Figure 5. Total yield of the cooking experiments (left) and specific energy consumption of the TMP refining trials (right) for spruce and pine samples with lower and higher growth rates.

5. Future plans and key development needs

The project revealed the potential of intensive forest management to cost-efficiently and sustainably ensure a high quality raw material supply for the Finnish forest and energy industries. These results, together with the advanced analysis methods developed in this project, encourage further research to extend the wood supply scenarios and calculations from a case-study level to the practical level.


This work is continuing in the EffFibre programme. The goal is to produce forest management scenarios at the national level, applying up-to-date information from the latest National Forest Inventory as the starting point. The definition of the scenarios will be carried out in close cooperation between the industrial and research partners of the EffFibre programme. Scenarios will be calculated applying the methods developed in this project (such as the extended version of the Motti software).

In order to be widely applied, practical-scale demonstration areas of intensive forest management are needed. In these areas, practical feasibility can be evaluated and practical methods for intensive forest management can be further developed and modified. An extensive forest area for demonstrating intensive management is therefore being proposed by Metsähallitus and Metla to serve as a crucial element of the EffFibre programme.

There is a clear need to improve and develop means of environmental communication and to produce scientifically grounded solutions for evaluating the currently most important environmental indicators for forest-based industry: biogenic carbon balance (in growing stock and soil), water footprint, land use change, and biodiversity. The EffFibre programme will take these research questions into account and use the knowledge gained not only in the New Value Chains project but also in other VTT projects relevant to this area. Cooperation will be carried out with Metla, VTT and the Finnish Environmental Research Institute.

Figure 6. Integration of dense media fractionation and potential uses of the fractions (Hakala Juha, Kaijaluoto Sakari, Kalliola Anna and Simons Magnus).


The studies on the influence of raw material properties on the efficiency of pulp and papermaking processes will be continued in EffFibre WP2. The defibration phenomena of fast-grown wood in the TMP refiner plate gap and the effect of raw material/TMP quality on the behaviour of furnish in the papermaking process will be examined in realistic process conditions. Increased knowledge of these mechanisms can be utilized in the development of energy efficient defibration techniques for fast-grown wood from intensively managed forests. The effect of growth rate on the yield-saving potential of uniform and selective cooking and oxygen delignification will be clarified in cooperation with EffFibre WP4. The quality of wood, pulp and end products will be linked to carbon footprint calculations. Regarding fractionation prior to pulping, as the next step, larger scale fractionation trials with commercial equipment followed by pilot-scale mechanical and chemical pulping are proposed.

6. Exploitation plan and impact of results

The project provides new research-based information regarding the potential of alternative forest management strategies to produce high quality wood and biomass for the forest industry. The results regarding the wood supply alternatives are applicable in practice, for example, as a support at different levels of decision making concerning forest management or in forest policy making. The results can also be applied in the form of management recommendations on industrial wood production for forest owners. The project produces new information and develops new methods for assessing the energy and resource efficiency and environmental friendliness of alternative forest cluster value chains. The results of the carbon footprint analysis can be applied in assessing the environmental friendliness of alternative wood supplies and wood processing chains, and in assessing their effects on emission trading. The results are applicable in the decision making and planning of new practices for different parts of the forest cluster value chains.

The results regarding wood properties can be applied in developing new measures in industrial wood processing. For example, chemical pulp mills are ideal facilities for fractionation as the fractionation medium can be produced from fly ash or green liquor, both of which are readily available at the kraft pulp mill. Pulping concepts and some potential uses of fractions are illustrated in Figure 6.


7. Publications and reports

Behm, K., Liukkonen, S., Sokka, L., Wessman, H. Carbon footprint for different forest management options. VTT Research Report VTT-R-06769-10. 18 p.

Edelmann, K., Seppänen, V., Heikkinen, J., New value chains – Wood fractionation. VTT Research Report VTT-R-04772-09, 35 p. + appendix 24 p.

Edelmann, K., Seppänen, V., Heikkinen, J., New fiber properties through dense media fractionation prior to pulping. http://www.vtt.fi/inf/pdf/symposiums/2010/S263.pdf. 2009 Wood and Fiber Product Seminar. VTT and USDA Joint Activity. Harlin, Ali; Vikman, Minna (eds.). VTT Symposium 263. VTT, Espoo (2010), pp. 89–94.

Hynynen, J. 2008. Possibilities and methods to increase biomass production in Finland. Presentation at KCL’s Science Evening “Sustainability starts from the forest” 3.9.2008.

Hynynen, J. 2010. Comprehensive carbon footprint analysis of the value chains of forest industry. Presentation. SHOK Summit 2010. Helsinki 20.4.2010.

Seppänen, V., Heikkinen, J. and Edelmann, K., Pine wood fractionation into early wood and latewood rich fractions. Poster at the 2010 International Workshop on Wood Biorefinery and Tree Biotechnology, 21–23 June 2010, Örnsköldsvik, Sweden.


Functional genomics of wood formation towards knowledge-based breeding of wood

Project Manager: Teemu Teeri, [email protected]
Duration of the project: 1.5.2008–30.9.2010
Project budget: EUR 1,126,000

Project partners and roles of the participating organizations:

University of Helsinki, Department of Biological and Environmental Sciences, Plant Biology (Ykä Helariutta, Jaakko Kangasjärvi, Kurt Fagerstedt): Populus trichocarpa genome mining, gene model analysis, molecular biology, gene expression analysis, vector construction for plant transformations. Gene expression analysis of transgenic overexpression lines. Growing and phenotyping transgenic lines.

University of Helsinki, Department of Agricultural Sciences (Teemu Teeri): Detection of natural variation in the pine PST-1 gene, correlations between phenotypic variation in decay resistance and molecular variation in the PST-1 gene (together with Metla), second level candidate genes for stilbene biosynthesis using DNA microarrays.

Metla, Muhos Research Unit, Punkaharju Research Unit (Katri Kärkkäinen): High throughput screening of stilbenes, natural genetic variation in heartwood extractives, certified pedigree seed for orchards.


Abstract

The Functional genomics of wood formation II project aims at uncovering key genetic determinants responsible for wood development in forest trees with the aim of using them to control both wood quality and growth in forest tree breeding. The approach is to use several species of forest trees (birch, poplar, spruce and pine) and to utilize the special advantages of each species. In order to find ways to improve both wood growth and quality and to provide applicable biotechnology tools, transgenic poplar lines and phenotypes overexpressing various ERFs (Ethylene Response Factors) and cytokinin signalling enhancer genes were generated and screened. In addition, natural variation in total phenolics causing differences in decay resistance of pine was studied, and was found to be highly inherited. New methodologies for early selection of this naturally late expressing trait were developed. Furthermore, lignin biosynthesis, addressing the transport of monolignols in plant cells, was studied. This knowledge can be used in the breeding of trees for altered lignin content and quality in wood.

Keywords: birch, poplar, spruce, pine, ethylene, cytokinins, lignin, stilbenes, extractives, growth, heartwood


1. Project background

Kangasjärvi’s research group has earlier shown, together with our Swedish partner (Prof. Björn Sundberg’s research group at the Umeå Plant Science Centre, UPSC), that ethylene regulates the formation of tension wood and that it is a stimulator of cambial growth when directly applied to the stem (Love et al., 2009). Ethylene responses are mediated through a large family of ERFs (Ethylene Response Factors) characterized by their highly conserved ERF domain. We were able to identify 173 putative ERFs in the black cottonwood (Populus trichocarpa) genome (assembly version 1.1) carrying the ERF domain. The diversity of ERFs explains the multifunctional role of ethylene in plants, and thus they are the key players in understanding the ultimate function of ethylene in wood development. With the specific primers against all ERF genes we used real-time quantitative PCR to screen for those that were induced by ethylene in poplar stem tissues and during tension wood formation. To modify wood formation, the most prominent candidate ERFs (20 genes) were selected and gene constructs were made under a cambium/xylem specific promoter and successfully transformed (at UPSC, Prof. Björn Sundberg’s research group) to generate poplar lines overexpressing the selected ERF genes.
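The relative quantification behind such a qPCR screen is typically done with the comparative Ct (2^-ΔΔCt) method, in which the target gene is normalized both to a reference gene and to an untreated control. A minimal sketch of that calculation is shown below; the Ct values and gene labels are invented and do not correspond to the actual ERF candidates of this project.

# Illustration of relative expression from qPCR Ct values (Livak 2^-ΔΔCt method).
# All Ct values and gene labels below are invented placeholders.

def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Relative expression of a target gene, normalized to a reference gene
    and to the untreated control (2^-ΔΔCt)."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)

# Hypothetical ethylene-treated vs. control stem tissue
print(f"ERF candidate A: {fold_change(22.1, 18.0, 25.3, 18.2):.1f}-fold induction")
print(f"ERF candidate B: {fold_change(27.8, 18.1, 27.5, 18.0):.1f}-fold induction")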

Helariutta’s group has shown (in collaboration with Rishikesh Bhalerao, UPSC) that the plant hormone cytokinin is required for normal wood development (Nieminen et al. 2008) and that trees with reduced cytokinin signalling are thinner and shorter than regular trees. To investigate whether enhanced cytokinin signalling reciprocally increases plant biomass production, Helariutta’s group has constructed 25 cytokinin signalling overexpressor lines. These lines over-express the Arabidopsis CKI1 gene, which constitutively activates the cytokinin signalling pathway.

In the Fagerstedt research group, lignin biosynthesis, and especially the polymerisation of lignin by class III peroxidases, has been studied previously. This has led to the present investigation of the mechanism by which monolignols are transported from the cell cytoplasm into the apoplastic space where they are polymerised into lignin. There are three possibilities: Golgi mediated transport, ABC transporter mediated transport or, very hypothetically, transport through the plasma membrane through hydrophobic-hydrophilic interactions. In 2008 our collaborator Prof. Lacey Samuels and co-workers published an article in which they very plausibly excluded the Golgi mediated transport of monolignols from the possible mechanisms. Hence, we are concentrating on the other two transport possibilities.

In Scots pine, we have found in previous studies ample genetic variation in the decay resistance of heartwood (Harju and Venäläinen 2002), which was to a large degree explained by heartwood extractives (total phenolics, mainly stilbenes; Venäläinen et al. 2003). Furthermore, pilot studies suggest that families with high production of stilbenes after seedling damage also have a high concentration of the same extractives in the heartwood of adult trees. Structural genes encoding enzymes of the stilbene pathway will be the first candidate genes for extractives. Of these, we have made a moderate sampling of the pine stilbene synthase gene PST-1 and identified several SNP or small indel polymorphisms on the structural gene and its promoter sequence. Association mapping of the polymorphisms on the candidate genes utilises linkage disequilibrium created through the history of the natural population and forms the molecular basis for breeding of heartwood quality traits in forest trees. Association of wood property traits with polymorphisms can be used as a first step in selecting parental trees for seed production.


Metla has invaluable pedigree material for several forest tree species important in boreal forestry.

2. Project objectives

The main objectives of the growth rate of wood tasks are to:
• Generate and screen for transgenic poplar lines and phenotypes by overexpressing various ERF genes with modified wood formation (Kangasjärvi) and with enhanced cytokinin signalling or its downstream genes (Helariutta)
• Study the detailed function of various cytokinin signalling genes or downstream genes and ERFs in wood formation
• Obtain trees overexpressing ERFs or cytokinin signalling genes with enhanced wood formation
• Study the hormonal interactions crucial to wood formation.

In the genetic components of lignin formation in wood task, the main objective is to find out how we can alter the lignin content and composition in the cell walls, especially in Norway spruce. Hence, we are studying the possible monolignol transporter genes with the use of bioinformatics and by:
• Using specific ABC-type transporter inhibitors in Norway spruce to find out the mode of transport
• Isolating plasma membranes from Norway spruce xylem to identify the transport proteins
• Synthesizing activatable monolignols for use in ‘click chemistry’ experiments to label the transport proteins in the isolated plasma membranes
• Testing the effect of free monolignols on the mortality and growth of plant cells.

In the decay resistance and heartwood extractives task, the key objectives are to:
• Explore the natural variation in the pine PST-1 gene
• Identify second level candidate genes for stilbene biosynthesis
• Develop high throughput non-destructive analysis methods for heartwood extractives (stilbenes)
• Investigate natural genetic variation in heartwood extractives in tree breeding populations of Scots pine
• Produce certified pedigree seed for orchards.

3. Research approach

A full understanding of the factors regulating wood formation is essential in order to pursue the aim of modifying wood properties. We have previously shown that both ethylene and cytokinin are important hormonal mediators of xylogenesis and that tree biotechnology offers vast potential for xylogenesis stimulation. We therefore generated transgenic poplar trees with enhanced hormonal function to further study, dissect and confirm the precise genes that control and modify wood formation in these hormonal pathways. This enables us to also provide tools that can be transferred to other tree species as required.

In our study of lignin biosynthesis and its regulation, we are aiming to gain fundamental knowledge of how the amount and quality of lignin could be altered in wood. Previously, we have concentrated on lignin polymerisation by peroxidases. In the current project, we are focussing on the transport of monolignols into the cell wall space. If this step can be altered, we could also change the lignin content and composition in the xylem by applying the findings in the Norway spruce breeding programme.

Natural variation in the pine stilbene biosynthesis and regulatory genes is the causative reason for inherited differences in stilbene content and decay resistance of pine heartwood. In this part of the project, analyses of genetic variation in the induction of stilbene biosynthesis due to artificial abiotic stress (mechanical wounding) are conducted. The goal is to find markers for pine heartwood quality that could be used in early selection. The wide genetic variation found in stilbene concentration between mechanically injured seedlings offers a possibility for early selection of this naturally late expressing trait.

4. Results

4.1 Growth rate of wood (ethylene)

Kangasjärvi’s research group has mined and screened the expression profile for all the ERF genes in response to ethylene in woody tissues. To modify and enhance wood formation, the most prominent candidate ERFs (20 genes) were selected to be over-expressed under a cambium/xylem specific promoter, and successfully transformed at the Umeå Plant Science Centre (UPSC) (Prof. Björn Sundberg’s research group) to generate over-expression lines. Additionally, we made an antisense gene construct encoding CTR (a negative regulator of the ethylene signalling pathway) and transformed it into poplar to enhance ethylene responses. By summer 2008, approximately 140 individual lines over-expressing 20 different ERF genes were generated, after the lines were in vitro propagated at UPSC and later also at the University of Helsinki. The rate of over-expression for each individual ERF line was determined by quantitative real-time PCR; almost all of the lines successfully over-expressed the target ERF gene. All of the 140 ERF lines were grown under controlled greenhouse conditions during autumn 2009 and winter 2010 until they reached an average height of 150 cm. Height growth and stem diameter from four positions were measured weekly. As a result, we have observed approximately 30 ERF lines under various over-expressed ERF genes that show enhanced growth rate measured in stem volume. The increase in stem volume of these 30 ERF lines is typically between 25 and 50% after a 14-week growth period. Additionally, the height growth of the ERF lines is usually also slightly stimulated when compared to wild type trees. In general, over-expression of ERF genes did not result in phenotypes with aberrant stem growth habit or leaf morphology. We have also observed one ERF gene which resulted in stunted growth but which also contributes to very dense xylem structure when over-expressed in poplar. In the other ERF lines we have not observed any striking anatomical changes (e.g. in cell density of xylem) to date. Almost all of the lines were also analyzed using stem samples for possible modifications in wood chemistry using Fourier Transform Infrared Spectroscopy (FTIR) at UPSC (Prof. Björn Sundberg’s research group). We were able to confirm for five ERF genes that, when over-expressed, they cause very clear and consistent changes in wood chemistry. For some of these ERF genes FTIR analysis showed significant enrichment of sugars and/or increased glycosidic linkage. In addition, we have also detected significant alterations in lignin composition or reduced amounts of lignin as a result of certain over-expressed ERF genes. On the other hand, we have also observed ERF lines under certain overexpressed ERF genes which have enhanced growth rate without any obvious changes in wood chemistry. In conclusion, it appears that there are many very interesting phenotypes with enhanced (and one depressed) growth rate and modified wood chemistry.
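A stem volume comparison of the kind reported above can be approximated from diameter readings taken at a few stem heights, for example by treating the stem as a stack of sections and applying Smalian's formula. The sketch below is only an illustration of that arithmetic; the measurement heights and diameters are invented, and the exact procedure used in the project may differ.

# Hedged sketch of estimating stem volume from diameters at a few stem heights
# (Smalian's formula per section). All numbers below are invented.
import math

def stem_volume(heights_cm, diameters_mm):
    """Approximate stem volume (cm3) from diameters measured at given heights."""
    areas = [math.pi * (d / 10.0 / 2.0) ** 2 for d in diameters_mm]  # cross-sections, cm2
    volume = 0.0
    for (h0, a0), (h1, a1) in zip(zip(heights_cm, areas), zip(heights_cm[1:], areas[1:])):
        volume += (a0 + a1) / 2.0 * (h1 - h0)   # Smalian: mean area x section length
    return volume

wild_type = stem_volume([10, 50, 100, 150], [14.0, 11.5, 8.5, 5.0])
erf_line  = stem_volume([10, 50, 100, 150], [16.0, 13.0, 9.5, 5.5])
print(f"Relative stem volume increase: {100 * (erf_line / wild_type - 1):.0f} %")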


4.2 Growth rate of wood (cytokinin)

Helariutta’s group has shown (in collaboration with Rishikesh Bhalerao, UPSC) that the plant hormone cytokinin is required for normal wood development (Nieminen et al., 2008) and that trees displaying reduced cytokinin signalling are thinner and shorter than regular trees. Helariutta’s group has constructed 25 transgenic Populus lines with enhanced cytokinin signalling. These lines over-express an Arabidopsis CKI1 gene, which activates the cytokinin signalling pathway. Based on preliminary results from 14-week-old trees, volume and biomass have increased by about 50% in the best CKI1 over-expressing lines in greenhouse conditions. Stem diameter has increased and stem height has remained the same as the wild type in the best CKI1 over-expressing lines. The best lines are being selected for further, more detailed, analysis.

4.3 Genetic components of lignin formation in wood

Lignin, the second most abundant biopolymer on earth, plays a crucial role in the structural integrity of cell walls and the stiffness and strength of the plant body. Monolignol biosynthesis and the lignin polymerization process in the cell walls have been well-studied in recent years. However, the monolignol transport mechanism from living cells to cell walls is still unclear. In this project we are concentrating on this mechanism and focusing especially on the involvement of transporter proteins located at the plasma membrane and identifying them, if any, using mainly Norway spruce (Picea abies) and its lignin-forming cell culture.

We have drawn on the bioinformatics of the spruce EST database and available microarray data in public domains in order to find possible monolignol transporter genes. As ATP-binding cassette (ABC) transporters are involved in transporting a variety of small molecules and secondary metabolites in plants, it is plausible that they also transport monolignols. Approximately 300 putative ABC transporter genes are found in the spruce EST database (http://compbio.dfci.harvard.edu/tgi/cgi-bin/tgi/gimain.pl?gudb=spruce), a few of which are specifically upregulated in differentiating xylem, where lignification starts, in comparison to young needles or phloem in white spruce (Picea glauca). TC31887 and CO482045 have their closest homolog in Arabidopsis, AtABCB15, which is coordinately expressed with monolignol biosynthetic genes. A homolog of TC15686 is also found to be upregulated in developing xylem in another coniferous tree species, loblolly pine (Pinus taeda). In addition, the compression wood library in the pine EST database (http://compbio.dfci.harvard.edu/cgi-bin/tgi/gimain.pl?gudb=pine), produced from reaction wood which is characterized by a higher lignin content, contains one ABC transporter gene, TC83625.

In order to find the monolignol transport proteins in plasma membranes, we are using a novel ‘click chemistry’ approach with purified plasma membrane proteins from Norway spruce developing wood or lignin-forming cell culture, and azide-conjugated monolignols or adenine nucleotides. Click chemistry (photoaffinity labelling) is an approach used to enable the specific detection of proteins interacting with target small molecules. We are presently in the process of synthesizing monolignol azides which will then be activated as reactive compounds under UV radiation, and which then make covalent bonds with any plasma membrane proteins in their surroundings in order to fish out transporters. Their partial amino acid sequences will be used to identify the corresponding genes. In order to conduct this experiment successfully, it is essential to obtain pure plasma membranes.


Although a general method for the preparation of a microsomal fraction (including plasma membranes) and a two-phase partition system is available, the optimal conditions for developing Norway spruce wood have not been suggested. We have been optimizing the technique to find the best concentrations of dextran (Dex), polyethylene glycol (PEG) and potassium chloride for the two-phase partition system. We have tested Dex/PEG concentrations ranging from 5.9 to 6.5% together with 3, 4 or 5 mM potassium chloride. In addition, we have also analyzed the influence of water-soluble polyvinylpyrrolidone (PVP) and water-insoluble polyvinylpolypyrrolidone (PVPP) on the preparation of the microsomal fraction. We have come to the conclusion that microsome preparation with PVPP, followed by a two-phase partition system with 6.4% Dex/PEG and 3 mM potassium chloride, gives the best yield and purity of plasma membrane proteins, as judged by the Bradford protein assay and glucan synthase and cytochrome c oxidation assays. This condition will be used to perform click chemistry with monolignol azides and activated adenine nucleotides.

In another set of experiments we have tested the effect of ABC-type transporter inhibitors on radioactively labelled monolignol transport in a lignin-producing Norway spruce tissue culture line, and have concluded that ABC-type transporters do not seem to take part in this transport. This makes the ‘click chemistry’ approach even more important.

4.4 Decay resistance and heartwood extractives

Trees with a genetic capability for high stilbene production could be of considerable value to forestry. In order to obtain seedlings with this genetic potential, screening of the stilbene production of the genotypes currently used to produce commercial seed for forest tree nurseries would be beneficial. To enable such screening, we have studied the correlation between the heartwood quality (= concentration of stilbenes, concentration of total phenolics) of the progeny growing in a field test and the heartwood quality of the seed orchard grafts (= their mothers). The high positive correlation found between the concentration of total phenolics and stilbenes as well as in wood density between the mothers and their progenies indicates that selective clonal collection of Scots pine seed is a promising means of obtaining seedlings with genetic potential to produce high quality heartwood as mature trees.

High throughput screening of stilbenes: from wet chemistry to optical analyses.

High throughput screening of stilbenes is needed for different kinds of wood samples. Scots pine stilbenes PS and PSM exhibit unique Raman spectral features, which enables the measurement of stilbene content in wood optically. The aim is to replace laborious wet chemistry with optical analyses in stilbene quantification. In the initial stage, however, both methods are needed for modelling. A quantification model has been developed at Aalto University (Dr. Jääskeläinen) for a single Scots pine progeny trial. This task has been carried out jointly with the “Tannins and stilbenes for wood protection” project belonging to the FuBio programme, also launched by Forestcluster Ltd.

The Metla Central Laboratory in Vantaa has developed a fast GC-MS analysis method for the quantification of PS and PSM stilbenes. Concurrently, a method for optical UV resonance Raman (UVRR) analysis of stilbenes in increment cores was also developed at the Aalto University School of Science and Technology (A!). The results of GC-MS and UVRR measurements have been used to formulate a preliminary quantification model for stilbenes (Dr. Jääskeläinen at A!) for a single Scots pine progeny trial.


According to the preliminary results, there is a clear positive correlation between stilbene concentration measured by chemical (GC-MS) and optical (UVRR) methods. Accordingly, UVRR spectroscopy on solid Scots pine heartwood samples could provide a rapid method for stilbene content measurement to replace laborious wet chemistry.
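A quantification model of this kind is, at its simplest, a calibration of the optical signal against the reference wet-chemistry values. The sketch below fits such a linear calibration by ordinary least squares; the paired Raman intensities and GC-MS contents are invented placeholders, and the actual model referred to above may be more elaborate.

# Sketch of calibrating an optical stilbene measurement against reference
# GC-MS values with ordinary least squares. All observations are invented.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y ≈ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Hypothetical normalized Raman band intensities and GC-MS stilbene contents (mg/g)
raman = [0.12, 0.25, 0.33, 0.48, 0.61, 0.74]
gcms  = [1.9,  3.8,  4.6,  6.9,  8.4, 10.1]

a, b = linear_fit(raman, gcms)
print(f"stilbene (mg/g) ≈ {a:.2f} * intensity + {b:.2f}")
print(f"predicted content at intensity 0.40: {a * 0.40 + b:.1f} mg/g")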

4.4.1 Natural genetic variation in heartwood extractives

Analysis of stilbenes by GC-MS (at the Metla Central Lab) and total phenolics (at Metla’s Punkaharju Research Unit) of increment cores obtained from adult Scots pine breeding material from a 44-year-old field trial (half-sib progeny trial) has been finished and preliminary estimates of the genetic parameters have been determined. According to the preliminary results, in the subset of breeding material from the progeny trials, variation in stilbene concentration and in total phenolics is wide and highly inherited. Another progeny trial with identical families was sampled in April 2010. Analysis of stilbenes and total phenolics of the increment cores from this trial enables estimation of the interaction between the genotype and the environment.

In young seedlings, mechanical wounding induces production of heartwood stilbenes. According to the pilot study (Harju et al. 2009), there is wide phenotypic and genetic variation in the wounding response between seedlings. This offers a possibility for early selection of naturally late expressing traits. The response to mechanical wounding is studied in four-year-old seedlings. As regards the wounded seedlings sample, analysis of stilbenes has been conducted for 1/5 of the material at the Metla Central Lab. According to the preliminary results, there is wide variation in stilbene concentration between the seedlings.

4.4.2 Stilbene biosynthesis

Induction of stilbene (pinosylvin) synthesis was tested in 6-week-old pine seedlings under ultraviolet induction. 24 hours after induction, pinosylvin and pinosylvin monomethyl ether could be extracted from needles of the plantlets (analysed on TLC plates). Induction of stilbene synthase gene PST-1 transcription appears to be very fast, taking place within a few hours after induction. In addition, this induction is independent of translation, indicating that the factors needed for PST-1 gene transcription are preformed in pine cells. PST-1 and its possible isoenzyme variants belong to the plant polyketide synthase (PKS) superfamily and they share considerable sequence and antigenic similarities. Using an antibody raised in rabbits against the Gerbera 2PS enzyme, another member of the PKS family, we have been able to monitor the pine PST protein directly. The protein becomes detectable between 6 and 24 hours after induction.

In order to identify genes that are transcribed differentially (and possibly correlated with PST-1 transcription) in seedlings with different genetic capability of stilbene biosynthesis under induction and heartwood formation, we have prepared samples for transcriptome analysis. High and low reacting seedlings were either wounded or induced with UV and tissue samples were collected at different time points. So as to find a suitable window for investigating genes induced during heartwood formation, sampling of increment cores through the growing season is underway to measure expression of the PST-1 gene and the presence of PST protein.

4.4.3 Certified pedigree seed for orchards

The analysis of stilbenes (at the Metla Central Lab) and total phenolics (at Metla’s Punkaharju Research Unit) of the increment cores from the seed orchard clones has been completed. According to this pilot-scale study, there is a highly significant correlation in the concentration of stilbenes PS (r=0.83) and PSM (r=0.60) and total phenolics (r=0.50) between the heartwood of the grafts and their progenies growing in the progeny trial. Moreover, the clonal repeatability in the concentration of stilbenes and total phenolics between the clones is remarkable. Accordingly, selective clonal collection of Scots pine seed offers a promising means of obtaining seedlings with the genetic potential to produce high quality heartwood as mature trees. However, for final conclusions to be made regarding the certified pedigree seed for orchards, a new, larger sampling covering several seed orchards is required.

5. Future plans and key development needs

We selected the best 15 to 20 ERF lines (out of 20 ERF genes and 140 lines) to be re-grown under greenhouse conditions during summer 2010 to verify enhanced growth rate and wood chemistry. Knowledge of the fitness of trees over-expressing ERF genes under natural environmental conditions is of great value, since it is possible to apply this biotechnology also to other plant species that may be more appropriate for forest plantations. Thus, our goal is to establish field trials to study the fitness of ERF lines under natural growth conditions, but only after transformation of selected ERF genes into a Finnish hybrid aspen genotype. This will be done due to the fact that the current hybrid aspen genotype for our ERF lines originates from former Czechoslovakia and is not able to acclimate and survive the Finnish winter. The tree transformation will be conducted according to published protocols. Extensive understanding of the factors regulating wood formation is essential to the aim of modifying wood properties. Kangasjärvi’s and Helariutta’s research groups have shown that both ethylene and cytokinin are hormonal mediators of wood formation. Furthermore, they are possibly strongly connected to each other. In addition, we plan to study potential downstream genes, which are regulated by ERFs.

Among the extractives of Scots pine, the stilbenes pinosylvin (PS) and its monomethyl ether (PSM) provide decay resistance in timber as well as defence for living trees against pathogens and pests. The long-term aim is to enhance the supply of high quality and sustainable materials for the mechanical wood industry. Naturally durable heartwood timber is a realistic alternative to impregnated timber.

6. Exploitation plan and impact of results

Ethylene and cytokinin responsive wood traits are achieved in the tree by the specific expression of genes during wood development. We have identified several genes that regulate various hormone responsive wood properties, and the performance of these genes is being systematically tested in transgenic trees. Our main aim is to establish novel tools for tree breeding and to improve the future competitiveness of the Finnish forest industry. This knowledge can also be transferred from poplar to more widely used plantation species such as Eucalyptus through marker assisted breeding programmes or transgenic technology. Over-expression of ERF genes or a cytokinin signalling enhancer gene (such as CKI1), for instance, has the potential to result in permanent stimulation of wood formation. Utilization of these tree biotechnologies could offer a valuable means of providing an enhanced supply of material for the paper industry and for bioenergy production. Additionally, these biotechnologies could facilitate higher wood productivity per land area, potentially reducing the need to exploit further natural forest areas.

As lignin forms a major proportion of wood in our forest trees and affects, for example, the strength of sawn timber products and the pulping characteristics of wood, it is important to know how lignin biosynthesis is regulated in developing xylem cells. Once this is determined, we can apply this knowledge in tree breeding programmes to develop trees with modified wood properties.

Trees with a genetic capability for high stilbene production could be of high practical value to forestry. Genotypes having the capability for high stilbene production are expected to be resistant against pests and pathogens as living trees, and at maturity to produce heartwood for durable timber. It would therefore be important to screen the stilbene production of genotypes that produce commercial seed for forest tree nurseries.

7. Publications and reports

Scientific journals

Elo, A., Immanen, J., Nieminen, K., Helariutta, Y. 2009. Stem cell function during plant vascular development. Semin Cell Dev Biol 9:1097–106, Review.

Fagerstedt, K.V., Kukkola, E.M., Koistinen, V.V.T., Takahashi, J. and Marjamaa, K. 2010. Cell wall lignin is polymerised by class III secretable plant peroxidases in Norway spruce. Journal of Integrative Plant Biology 52: 186–194.

Harju, A., Venäläinen, M., Laakso, T., Saranpää, P. 2009. Wounding response in xylem of Scots pine seedlings shows wide genetic variation and connection with the constitutive defence of heartwood. Tree Physiology 29(1): 19–25.

Kukkola, E., Saranpää, P. and Fagerstedt, K.V. 2008. Juvenile and compression wood cell wall layers differ in lignin structure in Norway spruce and Scots pine. IAWA Journal 29: 47–54.

Love, J., Björklund, S., Vahala, J., Hertzberg, M., Kangasjärvi, J., Sundberg, B. 2009. Ethylene is an endogenous stimulator of cell division in the cambial meristem of Populus. PNAS 106:5984–5989.

Marjamaa, K., Kukkola, E.M. and Fagerstedt, K.V. 2009. The role of xylem class III peroxidases in lignification. Journal of Experimental Botany 60(2): 367–376.


Nieminen, K., Immanen, J., Laxell, M., Kauppinen, L., Tarkowski, P., Dolezal, K., Tähtiharju, S., Elo, A., Decourteix, M., Ljung, K., Bhalerao, R., Keinonen, K., Albert, V.A., Helariutta, Y. 2008. Cytokinin signaling regulates cambial development in poplar. PNAS 105:20032–20037.

Conference presentations and posters

Elo, A., Nieminen, K., Immanen, J., Helariutta, Y., Cytokinin signaling regulates cambial development. Auxin 2008, October 2008, Marrakech, Morocco (poster).

Elo, A., Functional Genomics of Tree Biomass – (CONICYT) Academy of Finland, SusEn Programme, Annual Seminar October 2009, Helsinki (oral presentation).

Fagerstedt, K.V., Marjamaa, K. and Kukkola, E., Peroxidases and lignification during xylem development in Norway spruce: Combination of substrate specificity and location. Gordon Conference on Plant Cell Walls, Boston, 2–7.8.2009.

Fagerstedt, K.V., Marjamaa, K. and Kukkola, E.M., Lignification and the role of class III plant peroxidases in its polymerisation in Norway spruce. Final symposium of COST Action E50, Wageningen, Holland, 8–11.7.2009.

Felten, J., Vahala, J., Love, J., Gorzsás, A., Kangasjärvi, J., Sundberg, B., Ethylene signaling via Ethylene Response Factors (ERFs) modifies wood development in hybrid aspen. International Workshop on Wood Biorefinery and Tree Biotechnology, June 21st–23rd, 2010, Örnsköldsvik, Sweden (poster).

Harju, A., Venäläinen, M., Partanen, J., Jääskeläinen, A.-S., Hatakka, R., Kivioja, A., Tapanila, T., Tanner, J. & Kilpeläinen, P., Cost-effective analysis of Scots pine stilbenes. Sixth WSE meeting, October 21–22, 2010, Tallinn, Estonia (poster).

Immanen, J., Elo, A., Zhang, J., Helariutta, Y., Academy of Finland SusEn Annual Seminar 31.5.2010, Helsinki (poster).

Immanen, J., Nieminen, K., Elo, A., Helariutta, Y., Cytokinin signalling in the regulation of cambial development in Populus. 2nd Poplar Symposium, March 2009, Göttingen, Germany (poster).

Immanen, J., Nieminen, K., Elo, A., Helariutta, Y., Cytokinin signalling in the regulation of cambial development in Populus. IV Seminario de Biologia Vegetal, October 2009, La Serena, Chile (poster).

Immanen, J., Nieminen, K., Elo, A., Helariutta, Y., Cytokinin signalling in the regulation of cambial development in Populus. International Workshop on Wood Biorefinery and Tree Biotechnology, June 2010, Örnsköldsvik, Sweden (poster)

Jääskeläinen, A.-S., Hatakka, R., Kivioja, A., Harju, A., Partanen, J., Venäläinen, M., A rapid method to determine pinosylvin content in pine heartwood by UV resonance Raman spectroscopy. COST FP0901 Meeting in Vienna, 4–5 February 2010.

Jääskeläinen, A.-S., Hatakka, R., Kivioja, A., Harju, A., Partanen, J. & Venäläinen, M., Pinosylvin distribution in wood as studied by UV resonance Raman spectroscopy. EWLP 2010 – 11th European Workshop on Lignocellulosics and Pulp, August 16, 2010, University of Hamburg (poster).


Nieminen, K., Immanen, J., Elo, A., Helariutta, Y., Cytokinin signalling in the regulation of cambial development in Populus. XVI Congress of the Federation of European Societies of Plant Biology (FESPB 2008), August 2008, Tampere, Finland (oral presentation).

Nieminen, K., Immanen, J., Elo, A., Helariutta, Y., Cytokinin signaling regulates cambial activity. 19th International Conference on Arabidopsis Research, July 2008, Montreal, Canada (poster).

Nieminen, K., Immanen, J., Elo, A., Helariutta, Y., Cytokinin signaling regulates cambial activity. Plant Vascular Development, May 2009, Banff, Canada (oral presentation).

Paasela, T., Lim, K.-J., Teeri, T.H., Induction of pinosylvin synthesis in Pinus sylvestris. XVI Congress of the Federation of European Societies of Plant Biology (FESPB 2008), August 17–22, 2008, Tampere, Finland (poster).

Partanen, J., Harju, A., Venäläinen, M. & Kärkkäinen, K., Towards exploitation of the quantitative variation in Scots pine heartwood extractives. Wood Structure and Properties, September 6, 2010, Slovakia (poster).

Takahashi, J., Wingsle, G., Kärkönen, A. and Fagerstedt, K.V., Exploring the monolignol transport mechanisms in Norway spruce (Picea abies). Cell Wall Meeting in Porto, Portugal, August 2010.

Vahala, J., Love, J., Björklund, S., Kangasjärvi, J., Sundberg, B., Ethylene is an endogenous stimulator of cell division in the vascular cambium. XVI Congress of the Federation of European Societies of Plant Biology (FESPB 2008), August 17–22, 2008, Tampere, Finland (oral presentation).

Vahala, J., Love, J., Rao, S., Kumar, M., Felten, J., Sundberg, B., Kangasjärvi, J., Ethylene stimulates cambial cell division and enhances wood formation in Populus. IX Finnish Symposium on Plant Science, May 17th–19th, 2010, Joensuu, Finland (oral presentation).

Vahala, J., Love, J., Rao, S., Kumar, M., Sundberg, B., Kangasjärvi, J., Ethylene regulates cambial cell division and wood formation in Populus. 20th International Conference on Plant Growth Substances, June 28th–July 2nd, 2010, Tarragona, Spain (poster).

Doctoral theses

Love, J. (2009) New insights into ethylene signalling and wood development. Acta Universitatis Agriculturae Sueciae. Doctoral thesis No. 2009:88.

Nieminen, K. (2009) Cytokinin signalling in the regulation of cambial development. Dissertationes bioscientiarum molecularium Universitatis Helsingiensis in Viikki. Doctoral thesis 22/2009.


Virtual pulp bleaching (VIP)

Project Manager: Tapani Vuorinen, [email protected]
Duration of the project: 1.6.2008–30.8.2010
Project budget: EUR 1,088,000

Project partners and roles of the participating organizations:

Aalto University / Chemical Engineering: Unit operation, mass transfer and thermodynamic modelling; reaction kinetics modelling and brightness model development; simulator programming.

Aalto University / Forest Products Chemistry: Evaluation of the relevant reactions involved, role of phenolic hydroxyl groups in oxygen delignification, UV Raman measurements, assistance in validation of the model.

VTT Technical Research Centre of Finland (KCL 1.6.2008–14.6.2009): Chromophore chemistry for the brightness model via model compound analyses and pulp bleaching experiments; complementing chemical analyses for the oxygen delignification and intrinsic viscosity model.

VTT Technical Research Centre of Finland: Oxygen delignification chemistry, bleaching chemistry, analytical methods.

Lappeenranta University of Technology (LUT): Mill trials planning and implementation together with Fibera Oy.


Abstract

A package of computer models was developed to enable simulation of pulp bleaching without experimental limitations. The models are based on detailed descriptions of actual elementary chemical reactions, thermodynamics and transport phenomena. The simulation was validated as giving realistic outputs (kappa number, brightness, AOX, OX, etc.) for commonly used bleaching stages and their combinations. The simulation model enables testing of theories and new bleaching concepts (chemistry, chemicals and processes) and thus provides a new, economic means of increasing knowledge of bleaching processes. The model can also be used for guiding/focusing laboratory experiments. In the future, the tool will be applied to the development of new industrial bleaching concepts. The approach will also be expanded to cover phenomena-based simulation of wood chip cooking and biorefining.

Keywords: chemical pulp, bleaching, chromophore, oxygen, chlorine dioxide, alkaline, hydrogen peroxide, modelling, simulation, chemical engineering


1. Project background

VIP (Virtual Pulp Bleaching) was a follow-up project of the ABLE (Advanced Bleaching Plant) project. The purpose of ABLE was to gather fundamental theoretical knowledge on bleaching chemistry and related physicochemical phenomena (kinetics of bleaching reactions, mass transfer, and thermodynamics) and to develop computer models based on that knowledge. Bleaching chemistry was described at the micro-scale level (molecular reactions and mass transfer), and micro-scale models were combined with macro-scale models describing the flow of pulp suspensions and liquors in mixers, reactors and washers. In ABLE, the project team managed to develop models for D0 and EOP stage chemistries as well as for pulp washing. These models were validated against data collected from an industrial process and from laboratory experiments. Models for the O and A stages were developed in part, but not validated.

ABLE proved that fundamental information from a range of sources can be gathered and used to produce models that describe bleaching chemistry at a highly detailed level. The developed tool could distinguish the effects of mass transfer, thermodynamics and chemical reactions. ABLE models gave valuable information on different chemical phenomena that could not be directly measured by analytical means. General bulk parameters used in the industry, such as kappa number, viscosity, TOC, COD and AOX, were computed from molecular level information during the simulations. The models can be considered as a virtual bleaching laboratory that can be utilized in a variety of ways. The development of ABLE models can be regarded as a considerable academic achievement; fragments of theoretical knowledge from numerous sources of literature were collected, assessed, documented and combined into an effective computational tool.

Reaction kinetic models for all bleach-ing stages of interest could not be devel-oped under ABLE, as the amount of work required exceeded available resources. The VIP project was therefore tasked with completing the work initiated by the ABLE project.

2. Project objectives
The objectives of the project were to:
• Compile a library of oxygen and peroxide reactions
• Complete the current chlorine dioxide reaction library to enable simulation of the alkaline extraction and final brightening stages
• Develop a brightness model, i.e. specify the relevant chromophoric structures and their relative contribution to pulp colour
• Develop an intrinsic viscosity / polymer chain degradation model
• Validate the developed models with literature/laboratory/industrial data
• Train project partners to use the developed simulation model.

3. Research approach
The goal was to develop mechanistic models for reactions of oxygen chemicals with fibre wall structures, including chromophores. The reaction routes and, where available, kinetic parameters were drawn mainly from literature. Actual compounds were favoured, but lignin and some of its degradation products needed to be modelled as pseudo-compounds closely resembling the actual structures detected directly from the lignin polymer, or from lignin model compound studies. In addition, the aim was to build the reaction chemistry library from elementary reactions, even though, in order to keep the reaction library sufficiently simple, some of the implemented reactions consist of numerous reaction steps. In order to enable the computation of brightness, the Kubelka-Munk parameters needed to be determined for each compound.

The reaction library for oxygen chemicals was intended to be general enough to enable it to be used for simulation of the O, E (with no reinforcement or with oxygen and/or peroxide reinforcement) and P stages. Lignin oxidation experiments in alkaline conditions under oxygen pressure were conducted to obtain validation data for lignin reaction mechanisms. Alkali extraction laboratory experiments (with or without oxygen and/or peroxide reinforcement) were conducted for D0 pulp to validate the selected chromophore formation and degradation reaction scheme. Intrinsic viscosity measurements conducted for the alkali extraction stage pulps also validated the carbohydrate degradation scheme. Reactions involving only inorganic compounds (oxygen chemical and sulphur compounds) were taken from literature. The reactions of extractives were also included. Literature data was used for further validation of the overall oxygen chemical library.

The chromophore chemistry in the chlorine dioxide stages was also defined based on laboratory D0 and D1 experiments. With a set of experiments, the role of chlorite was elucidated and the findings entered in the chlorine dioxide reaction library. The reactivity of chlorine dioxide with the hydroxylated quinone structures formed at the alkaline extraction stage was verified.

Kubelka-Munk parameters (k457) of potential chromophoric structural units were obtained from the UV-VIS measurements of corresponding model compounds in aqueous solutions. Sheet impregnation studies were conducted to validate the link between liquid absorptivity and sheet brightness.

4. Results

4.1 Oxygen chemical reaction library
For the development of schemes for lignin reactions with oxygen in alkaline conditions, a vast set of laboratory experiments and analyses was conducted. Softwood and hardwood kraft lignins (from MeadWestvaco) were used as the starting materials. Experiments were conducted at varying temperatures (90 and 110°C) and initial oxygen pressures (5 and 8 bar) with a constant alkali charge and a reaction time of 240 minutes. Samples were taken as a function of time (5, 10, 20, 30, 60, 120, 180 and 240 min) to reveal the reaction kinetics. Several compounds/structures were analyzed from the samples. The formation of acidic products was followed by pH, conductometric titration (total acids) and capillary electrophoresis (CE giving small molecular weight acids). The CE method was partly developed in this project. COD (Cr and Mn), TIC and TOC were also determined in order to monitor overall changes in the sample composition. The dried samples were analyzed by 31P NMR, which gave information on the structural changes of the lignin, particularly different types of hydroxyl groups. Kraft lignins were also analyzed for metals, sulphur, elemental composition (C, H, N) and methoxyl group content. The degree of demethylation was monitored based on released methanol. In addition, lignin degradation was monitored using molar mass and UV-VIS measurements. A decrease in phenolic units and an increase in conjugated phenols were detected according to the ionization UV-VIS spectra. Decreasing oxygen pressure was recorded during the entire reaction period. This enabled the estimation of oxygen consumption in the reactions.

In general, the analysis results were logical. When the temperature and initial pressure in the delignification stage were increased, the lignin reactions became faster and the effects on the lignin structure were more emphasized. During the first 60 min the reactions were rapid. This was evident from the degradation of phenolic lignin, formation of organic acids, and increase in total charge and oxygen consumption. After the first 60 min period the reactions slowed.

Both volatile and non-volatile carboxylic acids were detected in the reaction mixtures. Of these, the former group includes formic and acetic acids and the latter includes dicarboxylic acids and hydroxy acids. The dominating acids were the formic, oxalic, acetic, and glycolic acids. The concentrations of these four acids corresponded to almost 90% of the total acid content. The general trends in SW delignification experiments, detected by 31P NMR, were the increase in COOH and the decrease in guaiacyl OH units in respect to oxidation time. Condensed phenolic hydroxyl units were found to be stable under oxygen delignification conditions. No clear trends were observed in the case of catechols or p-hydroxyphenylpropane units. The overall carboxylic acid group formation followed a similar progressive trend as the small aliphatic carboxylic acids in the reaction solution, as quantified by CE, in respect to reaction time. Analysis of non-volatile acids, co-precipitated with lignin, revealed that 30% (5 min of delignification time), 60% (10 min) and 95% (20–240 min) of COOH units consisted of these non-volatile small molecular acids and only a minor portion was bound to the polymeric lignin. Oxalic acid was the most abundant non-volatile acid component; other identified acids were glycolic, fumaric, succinic, malonic, malic, and maleic acids.

The 31P NMR results for HW lignin showed a progressive decrease in syringyl and guaiacyl units, while the number of COOH groups increased similarly to the SW case. Comparison with the starting material showed that the total amount of aromatic units fell below that of HW lignin after 20 minutes of oxidation. During oxygen delignification, the amount of α-carbonyl structures increased and, simultaneously, the proportion of conjugated structures (Cα=Cβ) decreased, as analyzed by pyrolysis-GC/MS.

Peculiar behaviour was observed in the development of conjugated phenolic lignin: during the first 20 to 30 minutes more conjugated phenolic lignin was formed, after which its amount decreased. In milder conditions, the amount of conjugated phenols was slightly higher at the end of the reaction period (240 min) than at the beginning, but in harsher conditions a lower level was attained.

Based on literature sources, a highly detailed lignin degradation scheme accounting for most of the experimental observations could be developed. The reactions start when phenolates, i.e. dissociated phenolic lignin units, react with molecular oxygen. The resulting phenoxy radical reacts with superoxide anion (an electron reduction product of oxygen) through paths which result in the release of methanol and the formation of formic acid, oxalic acid, malonic acid, carbon dioxide, hydrogen ions, hydrogen peroxide and acids, which are still attached to the lignin macromolecule. The production of hydrogen ions decreases pH. Decreasing pH in turn lowers the degree of dissociation of phenols (phenolate concentration), retarding the first reaction step. Oxygen is consumed in the first reaction step (phenolate + oxygen), but also later when the partly oxidized lignin structures are further degraded. The hydrogen peroxide formed as a side product largely undergoes catalytic decomposition into hydroxyl radicals and superoxide. The dissociated counterparts of hydroxyl radicals, oxyl anion radicals, are responsible for forming conjugated phenols. These structures have a lower reactivity than non-conjugated phenols, and it seems that at the end of the oxidation process the majority of non-condensed phenols are conjugated. Degradation of the conjugated structures leads to formation of glycolic acid and acetic acid. The condensed phenol concentration did not change significantly during the experiments. In the model, the degradation rate for condensed phenol is lower than that of non-condensed phenol, and during the course of the experiments its formation (via phenoxy radical coupling) exceeds the effect of degradation.
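As an illustration of how elementary steps of this kind are assembled into a kinetic model, the sketch below integrates a heavily simplified two-step scheme (pH-dependent phenolate formation, initiation by oxygen, and acid/H+ release via the phenoxy radical and superoxide path). The rate constants, acid dissociation constant and starting composition are illustrative assumptions only; this is not the VIP reaction library.

```python
# Heavily simplified sketch of alkaline lignin oxidation kinetics -- not the VIP
# reaction library. All constants and initial amounts are illustrative assumptions.
from scipy.integrate import solve_ivp

KA = 1.0e-10   # assumed acid dissociation constant of phenolic lignin units
K1 = 5.0e-3    # assumed rate constant, phenolate + O2 -> phenoxy radical + superoxide
K2 = 2.0e-2    # assumed rate constant, phenoxy radical + superoxide -> acids + H+

def rates(t, y):
    phenol, o2, radical, superoxide, acids, h = y
    phenolate = phenol * KA / (KA + h)       # dissociated fraction at the current pH
    r1 = K1 * phenolate * o2                 # initiation consumes oxygen
    r2 = K2 * radical * superoxide           # degradation releases acids and H+ (lowers pH)
    return [-r1, -r1, r1 - r2, r1 - r2, r2, r2]

y0 = [0.10, 0.05, 0.0, 0.0, 0.0, 1.0e-12]    # mol/L; roughly pH 12 at the start
sol = solve_ivp(rates, (0.0, 3600.0), y0, max_step=10.0)
print(f"phenolic units left after 1 h: {sol.y[0, -1]:.3f} mol/L, "
      f"acids formed: {sol.y[4, -1]:.3f} mol/L")
```

The same pattern (species balances built from elementary rate expressions) scales up to the full reaction library; in practice the rate parameters are taken from literature or fitted to validation data such as the lignin oxidation experiments described above.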

In addition to lignin degradation reactions, reactions were defined for carbohydrates, extractives, inorganic sulphur compounds and inorganic oxygen species. The reaction stoichiometries and kinetic parameters of the inorganic reactions were adopted from literature. The rate parameters for transition metal catalyzed hydrogen peroxide decomposition were obtained experimentally. The kinetics of extractives degradation was based on the observation that roughly half of the extractives react during oxygen delignification. Carbohydrate degradation schemes (also discussed in 4.3) were also based on literature. Parameter tuning was carried out based on literature data and the alkaline extraction results obtained in this project. Reactions of inorganic sulphur compounds, extractives and carbohydrates consume alkali, thus retarding the lignin reactions. Degradation of carbohydrates lowers yield and intrinsic viscosity.

The effect of COD load (consisting of lignin, extractives and inorganic sulphur compounds) on carbohydrate degradation can currently be well predicted, but more work is required to validate the modelling of the effect of alkali charge on carbohydrate degradation.

Work was carried out to determine the antioxidant efficiency of phenols in the oxidation of sugar enediols by oxygen. Phenols are converted to phenoxy radicals upon reaction with the initially formed sugar enedioxy radicals, thus preventing the oxidation of sugars. The phenoxy radicals convert part of the initially formed superoxide back to oxygen while they are reduced to phenols. Another part of the phenoxy radicals is oxidized permanently by superoxide and/or oxygen. Measuring the antioxidant efficiency provides information on the relative reaction rates of the alternative routes involved. The antioxidant efficiency was evaluated by measuring the oxygen consumption for the oxidation of model compounds (glucose and dihydroxyacetone). Measurements were conducted with varied sugar and phenol concentrations at two temperatures (25°C and 35°C). Reaction rates were calculated based on the oxygen consumption and ratios between approximated reaction rate constants. The oxidation rate for glucose remained close to constant until the oxygen was consumed, whereas the oxidation of dihydroxyacetone slowed. It was discovered that oxidation of dihydroxyacetone was proportional to the square of sugar concentration. The proposed reaction scheme was in accordance with the experimental results.


4.2 Brightness model development
The main tasks in developing the pulp brightness model were to:
1. Identify the most important chromophoric structures in pulp and their contribution to pulp colour
2. Define a reaction scheme for the proposed chromophoric structures in the O, D and EXX stages, and their effect on brightness development
   • Distinguish between the brightness increase achieved through actual chromophore reactions and through delignification
   • Evaluate the role of carbohydrate chromophores on pulp brightness
3. Characterize the main fibre wall chromophores and, if necessary, develop analytical tools for quantitative detection of specific chromophoric structural units

Throughout the project, a substantial amount of experimental work was conducted to identify the most important chromophoric structures. An extensive literature study was also conducted. Aqueous solutions of various quinonoid structures (o-quinones, hydroxylated quinones, stilbene quinones) were found to produce very high molar absorptivities at λ = 457 nm (UV-VIS measurements). Catechols and hydroquinones are colourless at neutral pH, but develop intense colour under alkaline conditions (in the presence of atmospheric oxygen). The same applies to p-quinones, which in neutral conditions have only a pale yellow colour.

Studies of simple phenolic model compounds showed no effect of ionization on lignin colour, although UV-VIS measurements with kraft lignin suggested that ionization of some (phenolic) lignin structures has a reversible effect on lignin colour. A higher degree of conjugation is probably possible in lignin due to its macromolecular structure. With polyphenol model compounds (catechol, hydroquinone, pyrogallol), the effect of ionization could not be distinguished from the other alkali-induced reactions. In alkaline conditions, several reactions (ionization, oxidation, hydroxylation, polymerization) take place with these compounds, leading to formation of more coloured structures. In acidic conditions only p-quinones were found to react further to form more coloured structures, although the conversion was minor compared to that occurring in alkaline conditions.

Impregnation studies were performed to convert the molar absorptivities (ε) of quinones into structure-specific light absorption coefficients (k457), which is the parameter used in the brightness model. The impregnation studies also showed that the brightness drop caused by 10 mmol/kg of o-quinone was over 30 percentage units when a base paper with a scattering coefficient of 40 m2/kg was used. A similar amount of p-quinone resulted in a brightness loss of only 2 percentage units. Thus, in softwood pulp with a typically lower scattering coefficient (~30 m2/kg), reaching the maximum brightness close to 89–90% requires an o-quinone content of 0.1 mmol/kg or less (assuming there are no other chromophoric structures present, which is not the case in actual pulp). To reach brightness levels of over 80%, the o-quinone content must be lower than 1 mmol/kg, whereas p-quinone contents as high as 20 mmol/kg are tolerated in fully bleached pulps.

The model compound results were used as the basis for defining the chromophoric structures for the model. O-quinone and hydroxyquinone structures were defined as the main chromophores. Hydroquinones, catechols, p-quinones, and benzenetriols (polyphenols) are the main leuco-chromophores, i.e. colourless structures, which can react to form coloured products. Information from literature was used to define the reaction paths that generate and destroy these structures.

To evaluate the possible role of carbohydrate chromophores on pulp brightness, lignin-free cotton linter was treated in typical kraft cooking conditions. This resulted in a brightness decrease of 3 to 4 percentage units, confirming that coloured structures were formed also in polysaccharides. However, compared to lignin, the effect was minor. According to the bleaching experiments, the carbohydrate chromophores could not be completely removed by chlorine dioxide, but were effectively destroyed by peroxide. It was considered unlikely that the carbohydrate chromophores generated during kraft cooking could survive bleaching, and hence only lignin-derived chromophores were included in the model.

A series of pulp bleaching experiments was conducted to produce validation material for the chromophore reaction scheme (and at a more general level for the brightness model). The aim was to distinguish between brightening through delignification and brightening through actual chromophore reactions. The main emphasis was on determining the effect of alkali, peroxide and oxygen at the E, EP, EO and EOP stages. The roles of pH, retention time, chemical charge and temperature were also considered. For the bleaching experiments, mill pulps from several stages (O, D0, EOP and D1) of a 4-stage bleaching line were taken.

The results of the E stage experiments confirmed that lignin darkening takes place during alkaline extraction, even though pulp brightness increases due to lignin removal. The darkening occurred rapidly and the majority of new chromophoric structures were formed during the first 5 minutes. The formation of coloured chromophores was enhanced by temperature and alkali charge, although no additional chromophore formation was observed with excess alkali charges. Surprisingly, fewer new chromophores were formed in the presence of oxygen, and the retarding effect of oxygen on alkaline darkening became clearer with increasing retention times. This could be related to the in situ formation of peroxide in the presence of oxygen. As expected, the lignin became brighter with increasing peroxide charge and retention time.

In the following D1 stage, pulp brightness increases due to lignin removal but also as a result of the lignin chromophore reactions. With model compound experiments it was confirmed that the coloured hydroxylated quinonoid structures formed in the alkaline conditions are reactive with chlorine dioxide, and thus contribute to brightening in the D1 stage. The formation of new reactive phenolic units in quinones through hydroxylation in alkaline conditions has also been suggested previously in literature as enhancing reactivity in the following D stage, but this had not been confirmed until now. The highest bleaching response was achieved in experiments with a final pH of 4, which was optimal both in respect to the delignification and chromophore reactions.

Throughout the project, analytical methods were developed to quantitatively analyze and follow the reactions of specific chromophoric structures in different bleaching conditions. This was the greatest challenge in the brightness model development. Due to the very low amounts of chromophoric structures, most methods tested are not sufficiently sensitive (pyrolysis, 31P NMR), and those having high sensitivity (UV-VIS) are not specific enough for detailed structures.

A new method for quantitative determination of quinones in pulp was developed. Phenazine derivatization was performed to increase the sensitivity of the method by increasing absorptivity and shifting the absorption maxima to higher wavelengths. The UV-VIS reflectance spectra were recorded from the phenazine-derivatized pulp handsheets and the Kubelka-Munk equation was applied to convert the spectrum into an absorbance (k/s) spectrum, which is linearly proportional to concentration. The quinone determination method was applied to a series of unbleached, semi- and fully bleached softwood kraft pulp samples. The results were logical with respect to existing knowledge regarding the reactions of quinones in bleaching.
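For reference, the conversion mentioned above uses the standard Kubelka-Munk form (the report does not reproduce the equation itself), which maps the diffuse reflectance R∞(λ) of an optically thick sheet to the absorption-to-scattering ratio:

$$\frac{k(\lambda)}{s(\lambda)} = \frac{\bigl(1 - R_\infty(\lambda)\bigr)^{2}}{2\,R_\infty(\lambda)}$$

At the low chromophore contents in question, k/s is to a good approximation linear in chromophore concentration, which is the proportionality the quinone determination relies on.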

On the basis of the known effect of quinone structures on pulp brightness, the UV-VIS reflectance measurements showed that, besides o-quinones, other chromophores also contribute to the brightness of E stage pulps. This may be due to yet unidentified chromophoric structures formed in the alkaline stages.

Pyrolysis was also tested as a method for detecting chromophoric structures directly from the pulp. Unfortunately, no quinonoid structures could be detected from pulps or from isolated residual lignin. Only hydroquinones and catechols were detected. Although non-coloured as such, they have been shown to react further in alkaline conditions, forming new unidentified chromophoric structures, and thus are important structures to monitor. In order to be able to identify alkali-induced chromophoric structures, hydroquinone and p-quinone model compounds were treated with alkali and the reaction products were analyzed by pyrolysis. After alkaline treatment, hydroquinones, p-quinones, and various phenolic structures were detected in different proportions for both model compounds. Besides quinones, all of the above are also found in pulps. It is, however, very difficult to distinguish their actual origin. A 31P NMR spectroscopic method for the detection of quinonoid structures was also implemented, but no quinones could be detected from isolated lignins with this method either.

4.2.1 Brightness model principles and validation

The molecular level chromophore chemistry is linked to pulp brightness in the following way:
• Lignin structures are divided into three categories according to their light absorption properties:
   1. Chromophores, i.e. structures with a high molar absorption coefficient (k457,i); the absorption coefficient values are defined according to model compound experiments. Structures in this category: hydroxyquinone, o-quinone.
   2. General lignin structures with a relatively low molar light absorption coefficient value; the absorption coefficient value is optimized during the model validation process. Structures in this category: phenolic lignin, non-phenolic lignin, leuco-chromophoric structures.
   3. Highly fragmented lignin derivatives with no contribution to pulp colour.
• The colour contribution of each lignin species is proportional to its molar light absorption coefficient (k457,i) and its concentration in the fibre wall (ci). The pulp absorption coefficient (k457) is obtained by summing up the colour contributions of each species: k457 = Σ (k457,i · ci).


Pulp brightness is computed from the pulp absorption coefficient (k457) and light scattering coefficient (s457) using the Kubelka-Munk equation.
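A minimal sketch of this bookkeeping is given below. The structure-specific absorption coefficients, their units and the fibre wall concentrations are invented for illustration and are not the values used in the VIP model; only the summation and the Kubelka-Munk inversion follow the scheme described above.

```python
# Minimal sketch of the brightness bookkeeping described above, with made-up
# absorption coefficients and fibre-wall concentrations (not VIP model values).
import math

# k457,i in m2/kg per (mmol/kg) of structure -- illustrative assumptions only
K457 = {"o-quinone": 3.0, "hydroxyquinone": 2.0,
        "phenolic lignin": 0.02, "non-phenolic lignin": 0.01}
concentration = {"o-quinone": 0.5, "hydroxyquinone": 0.3,      # mmol/kg
                 "phenolic lignin": 60.0, "non-phenolic lignin": 250.0}

def brightness(k457_total, s457):
    """Reflectance (%) of an opaque pad at 457 nm from Kubelka-Munk k and s."""
    ks = k457_total / s457
    r_inf = 1.0 + ks - math.sqrt(ks * ks + 2.0 * ks)
    return 100.0 * r_inf

# Pulp absorption coefficient: k457 = sum(k457_i * c_i) over the lignin species
k_total = sum(K457[name] * concentration[name] for name in K457)
print(f"predicted brightness: {brightness(k_total, s457=30.0):.1f} % (s = 30 m2/kg)")
```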

Pulp brightness develops as a result of delignification as well as through chromophore generation and destruction. Both processes take place according to the molecular level chemistry models implemented in the simulator.

The brightness model, and the chromophore reaction schemes it is based on, were validated with the experimental results obtained in the bleaching experiments. The aqueous lignin oxidation results obtained (4.1) were also exploited in the validation procedure. Although the main target was to optimize the chromophore reaction parameters, it was just as essential to validate and optimize the model parameters with respect to predicting the other main attributes (kappa number, AOX, OX, residual chemicals, viscosity, etc.).

The model was found to be capable of effectively reproducing the experimental brightness development in the E, EP, and EOP stages. The extent of delignification (kappa number reduction), peroxide consumption, residual alkali, and AOX release were also rather well predicted. The release of organic material (monitored as COD and TOC of the liquor) was consistently underestimated. This may result from the fact that the model assumes the delignification process to dissolve only lignin. Experimental work, in contrast, suggests that the dissolved high molecular weight material is not pure lignin but contains a fair amount of carbohydrates. The use of kappa number as the measure for fibre lignin content may also contribute to the inaccuracy of predicted TOC and COD. Highly oxidized (fully saturated) lignin structures are 'invisible' in kappa number analysis, but contribute to TOC and COD when released in the liquor.

The reinforcing effect of oxygen in EO bleaching was underestimated. The predicted improvements in brightness gain and kappa reduction, as compared to a plain E stage, fell severely short. It was hypothesized that oxygen would degrade a significant proportion of the phenols/phenolates created in lignin under alkaline conditions and thereby cause the improved decrease in kappa number. The reactions of molecular oxygen would lead to formation of hydrogen peroxide as a product, which would promote chromophore destruction. These reactions were implemented in the model, but the parameters could not be adjusted to a level that would have achieved the experimentally observed kappa reduction or brightness gain. An apparent conclusion is that the alkaline treatment did not produce enough substrates for oxygen. The development of liquid absorbance in the aqueous lignin oxidation simulations was, however, rather well in line with the results obtained in the lignin oxidation experiments. Resolving the EO stage incoherence requires better understanding of the structural changes taking place in lignin during alkaline extraction.

The brightness gain achieved in the D0 and D1 stages was predicted adequately. The experimentally observed effect of (final) pH on D1 brightness and kappa reduction could also be reproduced. The brightness increase obtained with chlorine dioxide is a complex result of chromophore chemistry as well as delignification.

The individual bleaching stage validations proved that the simulator predicts the pulp brightness behaviour within a single stage reasonably well. The performance in predicting brightness development in a multi-stage sequence was evaluated by simulating the following three bleaching series: O-D0-E (SW), D0-EOP-D1 (HW), and D1-N-D2 (HW). The O-D0-E simulations could be evaluated with a concise set of experimental results (D.Sc. thesis of Janne Laine, TKK, 1996). Overall, the results were credible. A major difficulty occurred in defining the initial composition of the fibre wall constituents. The simulation outcome is, obviously, quite sensitive to the chemical composition of the fibre wall. Exploiting non-structure-specific variables such as pulp kappa number, brightness, or liquor COD in defining molecular scale composition is precarious. Future efforts should therefore be focused on developing reliable analytical methods for characterization of the fibre wall.

The chromophore reaction scheme is presently fairly simple, which may be considered both an advantage and a weakness. More elaborate schemes could provide better coherence between the predictions and experimental results, but defining new structures and reactions with no means of validation goes against the modelling philosophy used in this project.

4.3 Intrinsic viscosity model development
Intrinsic viscosity is an indirect indicator of average cellulose chain length. A model correlating the intrinsic viscosity measurement and degree of polymerization of cellulose chains was adapted from literature. The model also takes into account the fact that hemicelluloses, which generally have a much shorter polymer chain than cellulose, decrease the average degree of polymerization.

Cellulose and xylan chains were modelled as consisting of various units with different reactivities. Hypochlorous acid as well as hydroxyl and oxyl anion radicals are capable of oxidizing the hydroxyl groups in glucose/xylose units in the middle of chains into carbonyl groups. Under alkaline conditions the glycosidic bond next to the carbonyl-containing unit cleaves. As a result, two new chain ends are formed: a reducing end and a non-reducing, carbonyl-group-containing end. Under alkaline conditions the reducing end enolizes, which leads to a sequence of reactions involving alkali, oxygen and hydroperoxide anion consumption. The reducing end units degrade into various soluble acids (formic, glycolic, 3,4-dihydroxybutyric, isosaccharinic, xyloisosaccharinic, 2-deoxyglyceric acid and carbonate) until the reducing end unit is oxidized into a carboxylic acid form. Experiments conducted with chlorous acid showed that chlorous acid can also oxidize the reducing ends in carbohydrates into carboxylic acids.

The parameters for this reaction were determined and incorporated into the chlorine dioxide chemistry library. The average degree of polymerization is computed from the concentration data of the various end and middle group units in the carbohydrate chains. This value is converted into intrinsic viscosity with the model obtained from literature.
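The chain-length bookkeeping can be sketched as follows. The constants in the DP-to-viscosity conversion are placeholders of the common Mark-Houwink form; the actual literature correlation used in the model is not reproduced here, and the input concentrations are invented for illustration.

```python
# Minimal sketch of the DP bookkeeping described above. K and a are placeholder
# constants of Mark-Houwink type, not the correlation used in the VIP model.
def average_dp(middle_units, reducing_ends, oxidized_ends):
    """Number-average degree of polymerization from unit concentrations (mol/kg).

    Every chain has two ends, so the chain count is half the total end-group
    concentration; each chain scission adds two new ends and lowers the DP.
    """
    total_units = middle_units + reducing_ends + oxidized_ends
    chains = 0.5 * (reducing_ends + oxidized_ends)
    return total_units / chains

def intrinsic_viscosity(dp, K=2.28, a=0.76):
    """Mark-Houwink-type conversion DP -> [eta] in ml/g; K and a are assumptions."""
    return K * dp ** a

dp = average_dp(middle_units=6.0, reducing_ends=0.004, oxidized_ends=0.002)  # example data
print(f"DP ~ {dp:.0f}, [eta] ~ {intrinsic_viscosity(dp):.0f} ml/g")
```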

The viscosity model was tested along with validation of the carbohydrate degradation scheme using literature data from oxygen delignification experiments as well as with the alkaline extraction results obtained in this project. As discussed in 4.1, the validation work regarding carbohydrate chemistry could not be completed during this project.

4.4 Completing the chlorine dioxide chemistry library
The previous chlorine dioxide model underestimated chlorous acid (Cl(III)) consumption. It was found that aldehydes are oxidized by chlorous acid. The reaction rates of several small molecular weight aldehydes with Cl(III) were determined and they were found to be high enough to compete with other reactions known to consume Cl(III) during bleaching. It was also discovered that during chlorine dioxide bleaching new aldehyde groups are formed in carbohydrates and in lignin. These reactions were implemented in the reaction library to increase the accuracy of the model. Efforts were also allocated to refining the theory of chlorate formation. The hypothesized reaction was found not to occur during bleaching.

5. Future plans and key development needs

The models will be applied and developed further under the EffFibre programme (2010–2013) of Forestcluster Ltd. Besides the bleaching simulation, new models will be created to simulate the cooking/biorefining of wood chips. There are also plans to develop a graphical user interface for the simulation tool.

6. Exploitation plan and impact of results

The end product of the project is a simulation tool that can be used in the fast and cost-effective development of next-generation bleaching processes that are characterized by low capital investments (compact sequences), low operation costs (reduced use of chemicals, water and energy) and low environmental load (BOD, AOX, chlorate). Examples of successful utilization of the first version of the tool already exist. A new three-stage bleaching sequence was developed with the following additional advantages:
• Water cycles can be closed further
• Less chlorine dioxide is consumed
• Less AOX is formed
• Chlorate content of treated effluent is reduced.

Conducting simulations prior to the experiments helps to focus laboratory work. Further simulations reveal areas that are insufficiently known and require more research.

The simulation model has been utilized in seven Master's thesis studies to date.

The model will also be utilized in the industrial consortium project Minimum Impact Hardwood Pulp Bleaching (2010–2013).


7. Publications and reports

Björk, C., Antioxidant effects of phenols in the oxidation of reducing sugars by oxygen, Master's thesis 2010, Aalto University.

Kalliola, A., Kuitunen, S., Liitiä, T., Rovio, S., Ohra-aho, T., Tamminen, T., Vuorinen, T., Lignin oxidation mechanisms under oxygen delignification conditions – Results from direct analyses, EWLP 2010, 16.–19.8.2010 Hamburg, Germany.

Kuitunen, S., Kalliola, A., Tarvo, V., Tamminen, T., Rovio, S., Liitiä, T., Ohra-aho, T., Lehtimaa, T., Vuorinen, T., Alopaeus, V., Lignin oxidation mechanisms under oxygen delignification conditions – Reaction pathways and modeling, EWLP 2010, 16.–19.8.2010 Hamburg, Germany.

Lehtimaa, T., Tarvo, V., Kuitunen, S., Jääskeläinen, A.-S., Vuorinen, T., The effect of process variables in chlorine dioxide prebleaching of birch kraft pulp. Part 1. Inorganic chlorine compounds, kappa number, lignin and hexenuronic acid content, J. Wood Chem. Technol. (2010), 30(1), 1–18.

Lehtimaa, T., Tarvo, V., Kuitunen, S., Jääskeläinen, A.-S., Vuorinen, T., The effect of process variables in chlorine dioxide prebleaching of birch kraft pulp. Part 2. AOX and OX formation, J. Wood Chem. Technol. (2010), 30(1), 19–30.

Lehtimaa, T., Kuitunen, S., Tarvo, V., Vuorinen, T., Kinetics of aldehyde oxidation by chlorous acid, Industrial and Engineering Chemistry Research (2010), 49(6), 2688–2693.

Lehtimaa, T., Kuitunen, S., Tarvo, V., Vuorinen, T., Reactions of aldehydes with Cl(III) in Chlorine Dioxide Bleaching, Holzforschung (2010), 64(5), 555–561.

Lehtimaa, T., Reactions of chlorine(III) and their kinetics in the chlorine dioxide bleaching of kraft pulps, Doctoral dissertation 2010, Aalto University.

Pääkkönen, T., Jääskeläinen, A.-S., Tamminen, T., Liitiä, T., Determination of quinones in pulp by UV/VIS reflectance spectroscopy, EWLP 2010, 16.–19.8.2010 Hamburg, Germany.


Pääkkönen, T., Quantification of quinones in wood pulps, Master's thesis 2010, Aalto University.

Rovio, S., Kalliola, A., Sirén, H., Tamminen, T., Determination of the carboxylic acids in acidic and basic process samples by capillary zone electrophoresis, Journal of Chromatography A (2010), 1217(8), 1407–1413.

Rovio, S., Kuitunen, S., Ohra-aho, T., Alakurtti, S., Kalliola, A., Liitiä, T., Tamminen, T., Lignin oxidation mechanisms under oxygen delignification conditions – Detailed characterization of the lignin, EWLP 2010, 16.–19.8.2010 Hamburg, Germany.

Rovio, S., Kalliola, A., Tamminen, T., Advanced methods to interpret lignin 31P NMR spectra, COST Action FP0901, 19.–21.2010, Hamburg, Germany.

Tarvo, V., Lehtimaa, T., Kuitunen, S., Alopaeus, V., Vuorinen, T., Aittamaa, J., The Kinetics and Stoichiometry of the Reaction between Hypochlorous Acid and Chlorous Acid in Mildly Acidic Solutions, Industrial and Engineering Chemistry Research (2009), 48(13), 6280–6286.

Tarvo, V., Lehtimaa, T., Kuitunen, S., Alopaeus, V., Vuorinen, T., Aittamaa, J., A model for chlorine dioxide delignification of chemical pulp, accepted for publication in J. Wood Chem. Technol.

Tarvo, V., Modeling chlorine dioxide bleaching of chemical pulp, Doctoral dissertation 2010, Aalto University.

Warsta, E., Vuorinen, T., Pitkanen, M., Addition of bisulphite to lignin α-carbonyl groups: a study on model compounds and lignin-rich pulp, Holzforschung (2009), 63(2), 232–239.


Short pulping

Project Manager
Marjo Määttänen, [email protected]

Duration of the project
1.6.2008–30.8.2010

Project budget
EUR 900,000

Project partners and role of participating organization

VTT Technical Research Centre of Finland: Enhanced cooking efficiency, shortening the bleaching process, development of fibre kappa analysis method, techno-economical and sustainability evaluation of new fibre line.

University of Helsinki (UH), Department of Physics, Electronics Research Laboratory: Novel measurement methods development and instrumentation, ultrasound processing and characterization of pulp and fibres.


Abstract
The main objective of the Short pulping project was to investigate whether partial removal of hemicelluloses by prehydrolysis improves impregnation and decreases the formation of coloured groups in pulp, thus enabling improved pulping efficiency. In order to predict and monitor the effect of process changes on pulp quality, characterization methods for pulp uniformity and strength properties were developed. The study found that the use of prehydrolysis for the efficiency enhancement of kraft pulping is not advantageous unless the considerable value potential of utilising the prehydrolysate components (hemicelluloses) is realised. The most valuable result of the prehydrolysis study was the understanding that connecting biorefinery stages to kraft processes requires extensive exploitation of the whole process chain. The kappa number variation of the samples can be reliably measured and the pulps can be compared using the developed sheet measurement based fibre kappa method. The speed of ultrasonic shear waves in a pulp sheet offers potential as an in-line process analysis measurement for the elastic modulus/tensile index of chemical pulp in pulp and paper mills.

Keywords: prehydrolysis, impregnation, uniformity, ultrasound


1. Project background

During chemical pulping, fibres are exposed to numerous chemical and physical treatments under severe conditions. The large number of unit processes involved generates high equipment and operational costs. The forest industry needs to improve its profitability and reduce its capital intensiveness. Although the industry's unit processes have been continuously modernised, more radical innovations and improvements are needed. Significant production process simplification offers great potential to improve both the ecological and economic competitiveness of the industry.

After kraft cooking, pulp is inhomogeneous and dark and requires several bleaching stages. This is a crucial reason for the length of the fibre line. Improved brown stock homogeneity and bleachability are therefore key to achieving more cost-effective pulping. The main aim of pulp bleaching is to remove the pulp colour, i.e. the chromophoric structures formed mainly during alkaline cooking. Process modifications aimed at minimizing colour formation during cooking are therefore also beneficial in respect to final bleaching.

In order to predict and monitor the effects of process changes on pulp quality, characterization methods for pulp uniformity and strength properties will be needed. The fibre kappa number method will increase knowledge of the impact of process changes on kappa number variation. Gustafson et al. have developed the Fibre Kappa Analysis (FKA) method, which is based on acridine orange staining and fluorescence detection of single fibres using a special device. The goal was to develop a method of measuring kappa variation from pulp handsheets using fluorescence microscopy.

The elastic modulus of the wet and dry paper web is a major factor affecting the runnability of a paper machine. It is affected by the elastic properties and bonding ability of the furnish components and the web draws of the paper machine. The elastic properties of chemical pulp are affected by changes in the fibre wall ultrastructure (e.g. level of hornification), degree of fibre deformation (e.g. curl and kinks) and fibre chemistry (e.g. hemicellulose content). Any sudden process change at the pulp mill (chemical dosage, pH, change in raw material etc.) can cause unwanted variation in the elastic properties of the pulp. There are currently no measurement methods capable of rapidly determining the elastic properties of pulp in order to control the chemical pulping process. Modern ultrasound analysis has shown potential in measuring pulp and single fibre strength properties both quickly and accurately. Fibre-level analysis will enhance understanding of the mechanisms behind sheet strength formation (paper and pulp). Such a sheet-level measurement method would provide pulp mills with a valuable tool for faster pulp quality control and development.

2. Project objectives
The overall objective was to develop pulping technology for a compact, flexible and economically efficient fibre line by improving the efficiency of kraft cooking and bleaching. Intensified kraft cooking was carried out by removing chemical (hemicelluloses) and/or physical (pit membranes) obstacles from the chips before cooking by means of prehydrolysis, different impregnation aids and ultrasonic treatment. These pretreatment methods were believed to improve impregnation and decrease energy and chemical consumption during cooking. Improved pulp homogeneity and decreased formation of coloured substances, enabling shorter bleaching sequences (2–3 stages) as a result, were also to be expected. The project also aimed at high kappa level cooking in order to achieve a more economical process with higher pulp yield and lower energy consumption during pulp refining when prehydrolysis was used prior to cooking.

In order to better understand the effect of process changes on pulp quality, pulp characterization methods were developed. These included a fibre kappa analysis method using pulp handsheets to measure pulp uniformity, as well as an ultrasonic pulp quality testing method used to evaluate the strength of single fibres and fibre bundles.

3. Research approach
The basic idea in the pulping efficiency improvement study was to use a relatively mild prehydrolysis treatment in order to retain a higher hemicellulose content in the pulp than in traditional prehydrolysis cooks for the production of dissolving pulp. The prehydrolysis research can be divided into three parts:

Evaluation of alternative hydrolysis treatments and the strength of hydrolysis. The alternatives were water hydrolysis, acid hydrolysis and alkaline extraction, and the strengths were 4, 8 and 12% yield loss after the hydrolysis stage. The evaluation was carried out by cooking and DEDED bleaching experiments. The outcome was that 12% yield loss was selected for the strength of the prehydrolysis stage, as this had the clearest effect on bleachability.

In the selection of cooking modification, alternatives for prehydrolysis (steam hydrolysis, water hydrolysis, acid hydrolysis or alkaline extraction) and neutralization stages (no neutralization or neutralization with black liquor, white liquor or green liquor) and possibilities to maintain pulp yield with additives were evaluated by displacement cooking, ODED bleaching and testing of papermaking properties. The outcome was that steaming based prehydrolysis with a white liquor neutralization stage gave the highest chemical savings.

An evaluation was made of the possibilities for minimizing the drawbacks of the prehydrolysis stage on pulping. Possibilities investigated included controlling yield loss and energy consumption during pulp refining by increasing the cooking kappa number and partly replacing oxygen with ozone and hydrogen peroxide in the post-cooking delignification stage. The outcome was that a higher kappa number and the use of ozone best counteracted the disadvantages of the prehydrolysis stage.

The enhanced impregnation study focused on pre-evaluating how the use of microwave and ultrasound treatments improves the impregnation of wood chips compared to the traditional pre-steaming method. Treatments were carried out with conventional equipment (microwave oven and ultrasound bath) not specially tailored for the experiments. The research consisted of impregnation and cooking trials and the evaluation of the reject content and fibre kappa distribution of the pulps, which are key indicators of pulp homogeneity. The outcome of the study was that microwave treatment is an interesting alternative to pre-steaming, but is not economically feasible due to its high energy consumption.

The development of a fibre kappa distribution method consisted of experimental work aimed at developing a repeatable and accurate method for the analysis of pulp homogeneity. Research was focused on repeatable staining, sheet forming with minimum overlapping, and image/data analyses. The outcome of the study was a repeatable method for measuring the relative kappa number variation.

The development of a method for ultrasound-based measurement of strength properties consisted of experiments to show the potential of ultrasonic pulp sheet measurement as a fast and accurate method of determining the elastic/tensile properties of chemical pulp. The aim was also to show the potential of ultrasonic measurement of fibre bundles for detecting changes in the elastic properties of fibres. The effect of inter-fibre bonding on the results can be reduced by measuring fibre bundles. The outcome of the study was that ultrasonic techniques show potential for both applications.
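As background for the ultrasonic approach, the first-order plate relations below connect a measured in-plane shear wave speed to the (specific) shear modulus of a sheet. This is a textbook approximation, not necessarily the exact signal analysis used in the project, and the numerical inputs are illustrative.

```python
# Back-of-the-envelope sketch: first-order relation between in-plane shear wave
# speed in a thin sheet and its elastic constants. Input values are illustrative.
def specific_shear_modulus(shear_wave_speed_m_s):
    """Specific shear stiffness G/rho, expressed as (km/s)^2."""
    return (shear_wave_speed_m_s / 1000.0) ** 2

def shear_modulus(shear_wave_speed_m_s, apparent_density_kg_m3):
    """G = rho * v_s^2 for a thin plate, returned in GPa."""
    return apparent_density_kg_m3 * shear_wave_speed_m_s ** 2 / 1e9

v_s = 1200.0      # m/s, example shear wave speed in a dry handsheet (assumed)
rho = 650.0       # kg/m3, example apparent sheet density (assumed)
print(f"G ~ {shear_modulus(v_s, rho):.2f} GPa, "
      f"G/rho ~ {specific_shear_modulus(v_s):.2f} (km/s)^2")
```

Because the specific modulus depends only on the wave speed, tracking the speed alone already gives a density-independent indicator of changes in fibre elasticity, which is what makes the sheet-level measurement attractive for in-line use.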

4. Results

4.1 Hemicellulose removal by prehydrolysis
The efficiency enhancement of kraft pulping was investigated by implementing a mild prehydrolysis stage prior to cooking. The basic idea was that by removing hemicelluloses (chemical obstacles) from the chips before the impregnation stage, consumption of cooking chemicals and cooking time could be reduced and chip impregnation would be easier and more homogeneous due to a more permeable structure. Improved pulp homogeneity and decreased formation of coloured substances, thus enabling shorter bleaching, was also expected. Research was also focused on evaluating possibilities for minimizing the observed drawbacks of the prehydrolysis stage on pulping. The findings were recognized as being valuable for future biorefinery processes.

4.1.1 Effect on cooking
Many of the anticipated benefits of prehydrolysis were not realized. Firstly, only the pre-steaming based prehydrolysis stage offered the possibility to improve the impregnation of cooking chemicals through improved penetration. Liquid prehydrolysis methods (acid and water) filled the fibre cavities with water, and cooking chemicals could therefore impregnate the fibre only by diffusion. The reject content of the prehydrolysis pulps was higher across the entire kappa number range compared to the reference pulps, indicating poorer impregnation and lower uniformity. It should be noted that the degree of impregnation for the reference cook was assumed to be as high as possible based on present knowledge, although such a level is rarely achievable in mill conditions. Secondly, the pH after the prehydrolysis stages was acidic, typically 3.5, leading to partial consumption of the alkali charge for neutralization. The prehydrolysis stage shortened the cooking time at cooking temperature (165°C), but the total processing time including prehydrolysis time at prehydrolysis temperature (160°C) was typically longer. Only when cooking to below kappa 25 was the prehydrolysis stage advantageous. The yield of prehydrolysis pulps was lower over the entire kappa number range. Increasing the cooking kappa and the use of carbohydrate stabilization (polysulphide cooking, borohydride treatment) improved the cooking yield by 1 to 2 percentage points.

4.1.2 Bleachability
The most promising results were achieved with oxygen delignification and bleaching. Prehydrolysis pulps (S88 Kappa 30, S88 Kappa 60, S88PS Kappa 30) showed better reactivity towards oxygen. They had a lower specific alkali consumption than the reference pulps in oxygen delignification (Figure 1). The final brightness was reached with 5 to 10% lower chlorine dioxide consumption in the DED sequence when cooked to kappa 30. A lower hexenuronic acid content of the prehydrolysis pulps explains about 2 kg of ClO2 as aCl/t of this reduction.

Figure 1. Specific alkali consumption of conventional and prehydrolysis pulps in oxygen delignification.

Figure 2. Yield and chlorine dioxide saving potential with prehydrolysis pulps.

The effect of the cooking kappa number of the prehydrolysis pulp on yield and chemical consumption in bleaching was studied. The results showed that by cooking the prehydrolysis pulp to kappa 60 (S88 Kappa 60), a 2 percentage point higher yield was obtained than by cooking the prehydrolysis pulp to kappa 30 (S88 Kappa 30). The saving potential in chlorine dioxide consumption was similar for both pulps (Figure 2). By using polysulphide in the prehydrolysis cook (S88PS Kappa 30), the yield gain was the same as in cooking the prehydrolysis pulp to kappa 60 (S88 Kappa 60) (Figure 2). The final brightness was reached with 10 kg/t lower chlorine dioxide consumption and a 3 percentage point lower yield compared to the reference pulp cooked to kappa 30 (REF Kappa 30).

When cooking the prehydrolysis pulp to kappa 20 (S88 Kappa 19 O), a similar final brightness was reached with 15 kg/t lower chlorine dioxide consumption compared to the prehydrolysis pulp cooked to kappa 30 (S88 Kappa 30). This was obtained without any additional yield reduction (Figure 2). Compared to the reference pulp cooked to kappa 30 (REF Kappa 30), chlorine dioxide consumption was 24 kg/t lower but the yield of the bleached pulp was 5 percentage points lower than that of the reference. However, in the case of the prehydrolysis pulp there is the possibility of attaining a higher value potential through the utilization of the hemicellulose component of the prehydrolysate.

4.1.3 Beatability and papermaking properties
Reduced pulp beatability was one of the main effects of prehydrolysis on kraft pulping. The phenomenon is readily explained by the low hemicellulose content of pulps produced by the prehydrolysis kraft process. The type of prehydrolysis (auto or acid hydrolysis) had little effect on beatability, but the extent of the prehydrolysis stage (yield) was crucial. Various means of carbohydrate stabilization (polysulphide cooking, borohydride treatment) improved pulp beatability only relative to their ability to increase the yield of bleached pulp. The beating revolutions needed to reach a constant tensile index in PFI beating were thus closely related to the yield of bleached pulp (Figure 3).

4.2 Enhanced impregnation
Impregnation and cooking experiments were carried out using different impregnation methods. The impregnation aids used were ultrasonic (US), microwave (1.5 or 4 min MW), surfactant (Surf), overpressure (5 bar) and pre-steaming (S). The most effective methods were pre-steaming and microwave treatments. The use of ultrasound treatment gave no benefits over the traditional pre-steaming method. With the same H-factor, 4 minutes of microwave pretreatment gave a kappa number 10–15 units lower than the other treatments (Figure 4). The H-factor required for a constant kappa number was about 200 to 400 units lower with screened and knotty chips, respectively. The lower H-factor is probably due to the chemical reactions occurring in wood during lengthy microwave treatment due to elevated temperature. Based on the reject contents, microwave treatment is competitive with pre-steaming. The drawback of microwave treatment is its high electrical energy consumption. With the treatment times used, 1.5 to 4 min, energy consumption was 150 to 600 kWh/t, respectively, making the process economically unattractive.

The positive effect of overpressure on impregnation was seen at high kappa numbers (Figure 4) as reduced reject content and reduced kappa number. The effect was not, however, seen with the pre-steamed reference pulps. Clearly, the pre-steaming time used was sufficient to remove the majority of air and, therefore, the influence of overpressure was not evident. During microwaving, part of the air inside the chips was removed along with the water vapour. The longer the treatment time, the lower the residual air content. A 5 bar overpressure compressed the residual air to one sixth, which improved impregnation.
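The one-sixth figure follows from isothermal compression of the residual air, assuming it starts at roughly atmospheric pressure:

$$V_2 = V_1\,\frac{p_1}{p_2} \approx V_1\,\frac{1\ \text{bar}}{(1+5)\ \text{bar}} = \frac{V_1}{6}$$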

The methods used to assess pulp homogeneity (fibre kappa distribution) and strength properties (elasticity based on ultrasound velocity in paper sheet and handsheet properties) detected no difference in pulp quality. Microwave-treated pulps were as good as the reference pulps.

Figure 3. Beating demand of bleached pulps vs. yield. Reference and prehydrolysis kraft pulps cooked to kappa 30 and ECF bleached to brightness level 88% ISO.

4.3 Development of fibre kappa distribution method
A new measurement method for pulp homogeneity was developed and tested, based on the same detection principle as in single fibre measurement by the Fiber Kappa Analyzer FKA (Gustafson et al.). The method is based on fluorescence microscopy of acridine orange stained fibres and image analysis directly from handsheets (Figure 5). Depending on the fibre's lignin content, individual fibres fluoresce at specific intensities of green and red wavelengths. Fibres with a low lignin content show a high intensity peak in the green wavelength region, thus appearing green in the fluorescence image. High lignin content fibres have a high intensity peak in the red wavelength region, thus showing as red. The red to green ratio is calculated, and an image analysis program gives a "kappa number" image accordingly. The image analysis procedure is written as a macro which enables image series to be automatically analyzed.
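A minimal sketch of this kind of analysis is given below; it is not the project's macro. The per-pixel (rather than per-fibre) treatment, the background threshold, the use of plain RGB channels as stand-ins for the red and green fluorescence bands, and the two calibration points are all illustrative assumptions.

```python
# Minimal sketch of a red/green-ratio "kappa" image analysis. Not the project's
# macro: thresholds, channel handling and calibration points are illustrative.
import numpy as np

def kappa_map(rgb, cal_lo=(0.8, 18.0), cal_hi=(2.0, 46.0), bg_threshold=20):
    """Per-pixel 'kappa number' image from the red/green intensity ratio.

    rgb         : HxWx3 uint8 fluorescence image (R and G channels used)
    cal_lo/hi   : two (ratio, kappa) calibration points from pulps of known kappa
    bg_threshold: pixels darker than this in both channels are treated as gaps
    """
    red = rgb[..., 0].astype(float)
    green = rgb[..., 1].astype(float)
    fibre = (red > bg_threshold) | (green > bg_threshold)      # exclude empty spaces
    ratio = np.where(fibre, red / np.maximum(green, 1.0), np.nan)
    # Two-point linear calibration of ratio -> kappa number
    (r1, k1), (r2, k2) = cal_lo, cal_hi
    slope = (k2 - k1) / (r2 - r1)
    return k1 + slope * (ratio - r1)

# Example: summarise the per-pixel kappa values of one (placeholder) sheet image
image = (np.random.rand(512, 512, 3) * 255).astype(np.uint8)   # placeholder image
kappa = kappa_map(image)
values = kappa[np.isfinite(kappa)]
print(f"mean kappa ~ {values.mean():.1f}, std ~ {values.std():.1f}")
```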

At the beginning of the project, analyses were conducted on thick handsheets, with staining carried out by dipping a piece of handsheet into the staining liquid. This caused surface roughening due to fibre swelling, which resulted in microscope focusing problems. To eliminate this problem, fibre staining was subsequently performed prior to making the handsheets. Using this method, 8 to 12 samples could be prepared during a work day. Thick 60 g/m2 handsheets were initially used for testing, but thinner handsheets (20 g/m2) were subsequently introduced in order to avoid problems caused by overlapping fibres. Fibre overlap increases the proportion of middle kappa number fibres, because the test gives an average value for the overlapping fibres.

Finally, very thin sheets (1, 2 and 5 g/m2) were made on a glass substrate to avoid the effect of overlapping fibres. However, the very thin sheets exhibited a high proportion of empty spaces between fibres. Image analysis software was used to eliminate these gaps and to exclusively measure the fibres. As the number of fibres per image is significantly reduced in very thin sheets, this had to be compensated for by taking more images in order to maintain measurement accuracy. The time-consuming imaging process involved reduces the benefit of this method compared to the FKA method. The final tests showed, however, that thin sheets reduced the effect of fibre overlap, enabling measurement using this technique, and the thin sheets on glass method ultimately proved to be ideal for microscope analysis.

The staining procedure occasionally caused reddish discolouration of the handsheets. The chemistry of staining with acridine orange (AO) is not known. However, the discolouration problem was identified as being caused by the presence of hypochlorite, which was used to counteract the autofluorescence of lignin. The final tests were performed without hypochlorite based on the assumption that the effect of lignin autofluorescence is negligible.

Figure 4. The influence of impregnation method on reject content with screened chips and 10% over-thick chips. Chip pretreatments: S = steaming, US = ultrasound, MW = microwave, surf = surfactant, 5 bar = 5 bar N2 pressure.


The repeatability and reproducibility of the method's results proved to be relatively good. However, it was found that fresh colorant should be used and that the fluorescence imaging should be carried out within a few days after staining. Single point calibration using the pulp's average kappa number and the average red/green fluorescence ratio did not work reliably with the histogram data. The rescaling of the red to green ratio with respect to kappa values needs to be based on at least a two-point calibration with the measured kappa numbers of the studied pulps (Figure 6). However, different wood species and cooking liquors can result in different responses to staining in terms of red/green fluorescence ratio. Thus, different calibration curves are needed for different kinds of pulps.

The outcome of the study was a repeatable method for measuring kappa number variation. The developed method has, however, certain limitations. While the kappa number variation of the samples can be reliably measured and the pulps can be compared with this method, the calibration of the red to green ratio to kappa number values is still uncertain and requires further testing.

5. Future plans and key development needs

The most valuable result of the prehydrolysis study was the understanding that connecting biorefinery stages to kraft processes requires extensive exploitation of the whole process chain. The kappa number variation of the samples can be reliably measured and the pulps can be compared using the developed sheet measurement based fibre kappa method. The fibre kappa analysis method requires further testing with respect to the calibration of the red to green ratio to kappa number. The results of the Short pulping project have been used in preparing the continuation of the EffTech programme.

Figure 5. “Kappa” images showing high kappa number fibres as brighter than low kappa number fibres. The left image shows a 20 g/m2 sheet, the right image a 5 g/m2 sheet. The empty spaces between fibres are removed by image analysis.


6. Exploitation plan and impact of results

The value of the prehydrolysis studies de-rives mainly from increased knowledge of the ways in which possible future biorefin-ery pulp mills would influence pulp mak-ing and how the drawbacks of prehydroly-sis can be decreased or even eliminated. The fibre kappa analysis method increas-es knowledge of the influence of process changes on pulp homogeneity, thus lead-ing the pulping process towards better and more uniform pulp quality.

7. Publications and reports

Montonen, R., Mustonen, K., Karppinen, T., Salmi, A., and Hæggström, E., “Stiffness Measurements of Single Plant Fibers with Ultrasound under Humidity Cycling”, IEEE International Ultrasonics Symposium 2009, Rome, Italy, 23–26.9.2009.

Mustonen, K., Quantifying the Effect of Fibre Damage on Pulp Sheet Strength by Ultrasound, Pro gradu thesis, Department of Physics, University of Helsinki, 2009.

Hanhikoski, S., Bleachability of Prehydrolysed Softwood Kraft Pulps, Puu-23.4030 Pulping Technology Research Project, Department of Forest Products Technology, Aalto University, 2010.

Figure 6. Fibre kappa distribution curves for three pulps with different kappa numbers (18, 30 and 46) and for one sample with mixed pulps (kappa 32), sheet grammage 20 g/m2. The dashed line shows the calculated curve for mixed kappa 18 and 46. The sample with mixed pulp showed a wide kappa number distribution curve. The X-axis values are calibrated using a three-point scaling method based on the measured kappa numbers of the used pulps. However, the correlation with the kappa numbers is not validated.


New process design methodology to reduce capital employed and to improve flexibility (POJo)

Project Manager: Risto Ritala, [email protected]

Duration of the project: 1.6.2008–30.8.2010

Project budget: EUR 1,200,000

Project partners and their roles:

Tampere University of Technology, Measurement Information (TUT) – Coordinator; measurements, control and decision support as degrees of freedom in production systems design; operational and design optimization

Aalto University, Department of Automation and Systems Technology – Present and future IT infrastructure in process industries as enabling technologies

University of Jyväskylä, Industrial Optimization (UJY) – Multiobjective optimization, formulation and solvers

University of Kuopio, Paper Physics (UKU) – Process modelling and model-based optimal operation and design of production systems

VTT, Technical Research Centre of Finland, Process Modelling – Dynamic process modelling of pulp and paper making processes, APROS dynamic simulator


Abstract

A general model-based multiobjective and bi-level production system design methodology has been developed. Alternative process structures and dimensionings are compared (upper level) while each design is operated optimally (lower level). Three levels of modelling are required: simple models for operational optimization to predict system behaviour over the short term; models of intermediate complexity for assessing design performance over the long term (including operational optimization); and detailed process models for design validation. Multiple objectives at both the operational and design levels are taken into account. At present, the design provides the structure and dimensioning of process equipment and the operational policy as scalarization parameters for operational optimization. The methodology has been tested for the design of a furnish management system (tower volumes and operational policies) in SC paper production. The methods provide a solid basis for production concept development within the EffNet programme. Further development of the methodology will concentrate on expanding the multiobjectiveness of operation and on dealing with more complex design tasks.

Keywords: design, operation, optimization, multiobjective, furnish preparation, SC paper, work flow


1. Project background

Heavy capital investment and inflexibility of production systems are the key business challenges of the chemical forest industry. In this industry, the design of production systems has been considered from the point of view of material equilibria at operating points. The design of production system dynamics has been based on rather coarse studies of production disturbances. The buffering volumes between production departments and mechanisms for attenuating disturbances have been designed accordingly, but only after the main structures of the production systems have been determined. The control system and operational decision support systems have thus had rather few degrees of freedom in the design and, as a result, both dynamic behaviour and capital efficiency are suboptimal. Many operations that could otherwise be carried out using information, computational methods and control are instead implemented with additional process equipment that carries high investment costs. Furthermore, implementing the management of disturbances using separate devices and equipment has led to lengthy transition times between operating points, i.e. poor flexibility.

Chemical engineering methods for integrated control and process structure design have been developed as best practices in the chemical process industries, but have not been applied in pulp and paper production. Furthermore, the recent ideas on biorefineries and their integration with conventional pulping and papermaking have opened up opportunities to radically rethink material flows and to seek completely different process structures.

2. Project objectives

The main goal of the project is to develop a process design methodology that is capable of fully utilizing the degrees of freedom provided by measurements, control and operational decision support systems and which thus reduces capital employed and enhances production flexibility. The methodology is based on dynamic modelling and multiobjective optimization.

The technical goals of the project are to:

1. Formulate the design as a multiobjective optimization problem on a dynamically modelled “super-structure”
• Using approximate solution methods
• Accounting for the role of uncertainty

2. Collect and develop a library of unit operations to be used in model-based analysis of production system concepts
• Using cross-disciplinary models, models of information operations

3. Understand the opportunities of IT and current best practices and, within the next 5–10 years, to transform actions made by unit process equipment into information-based actions

The methodology developed will be demonstrated by a large-scale design case study. The chosen case is the SC paper production system, in particular its functions in furnish preparation and water management.

3. Research approach

The research hypothesis of the POJo project is that production system design can be specified as a bi-level multiobjective optimization problem where a lower level optimizes the operation of the production system and an upper level optimizes the structure. Furthermore, it was hypothesized that such an approach to production system design can be implemented in practical production system design at the conceptual stage by using appropriately modified working practices. The approach is expected to be generic, but its practical implementation requires a domain-specific unit process model library.

In order to analyze the validity of the hypotheses, the following research questions have been addressed:
• Given a production system design task, what are the general guidelines for formulating it as a bi-level multiobjective optimization problem?
• Given a typical optimization formulation corresponding to a pulp and paper production system design, how can it be solved within the time frame allowed for conceptual design?
• How radically must working practices in production system design be re-engineered in order to take full benefit of the bi-level multiobjective approach to design?

POJo was initially designed for the entire duration of the EffTech programme, i.e. 4 to 5 years, and the research approach has been chosen accordingly. In order to understand the essentials of the bi-level multiobjective formulation and how it is solved, a series of design/operational problems of increasing complexity have been specified and analyzed as follows:
• Stage 1: Extremely simple generic systems which nevertheless contain the basic ingredients of the design (operational level: dynamic, stochastic, partially observable, multiobjective; design level: multiobjective, some objective(s) derived from operational level performance); e.g. the three-level system
• Stage 2: Multiobjective operational problems related to papermaking; e.g. multiobjective quality control and grade changes
• Stage 3: Relatively limited scope design problems related to papermaking; e.g. design of broke system volume, design of buffer volumes, design of TMP plant capacity and buffer volume
• Stage 4: Wider scope design problems related to papermaking: design of volumes of furnish/water towers for a single-grade SC paper machine (main case)

In the EffNet programme, which continues the work of EffTech/POJo, the stage 5 problems cover the design of the entire production system, with quality and resource efficiency as the main objectives at both the operational and design levels.

Analysis of the solvability of the design optimization problems led early on to the study of the unit process models at three levels of detail:
• Operational optimization models that are simple and fast to execute and are evaluated analytically rather than simulated
• Design optimization models that are dynamic mass balance models in which quality models are based on furnish composition. The models include operational dynamic optimization. The design optimization models are fast enough to enable hundreds of designs to be assessed in hundreds of operational tasks, which is achieved by simple simulation methods
• Design verification models that consist of detailed simulator models, most notably APROS.

The research approach regarding working practices in design included interviews and analysis of current design software tools, on the basis of which planning of working practices using bi-level multiobjective design is being developed in the EffNet project.


4. Results

This chapter is organized as follows: Sections 4.1 to 4.4 discuss the generic POJo approach to model-based multiobjective optimal design. 4.1 discusses the formulation of the design task, and 4.2 the implications for work flow in design. 4.3 outlines the required models, and 4.4 discusses the software aspects. Sections 4.5 and 4.6 present the results from the case studies. In 4.5, the methodology is illustrated with the main case study on a furnish preparation system for SC paper production. Although the main case is an industrially relevant design task, its analysis here is mostly from the point of view of testing the generic methodology. 4.6 presents some smaller cases analyzed during the project which have industrial relevance.

4.1 Design as a bi-level multiobjective optimization problem

The design of production systems requires the comparison of mathematically modelled design alternatives while each of them operates in the best possible way. The comparison of alternative designs and their optimal performance gives rise to a bi-level optimization problem (Figure 1). Additionally, the comparison of designs and the assessment of their performance are both carried out with respect to multiple conflicting objectives, making the optimization problem multiobjective at each level. To address production system design, it is therefore necessary to utilize the bi-level multiobjective optimization framework in order to develop a methodology to determine the best possible design.

With respect to modelling, optimal design of a production system involves choosing appropriate design variables, objectives and constraints representing the design problem at each level. From the optimization point of view, the design problem is then to find optimal values for the design variables with respect to the objectives and subject to the constraints. In the bi-level problem, the upper-level objectives typically model strictly design-related aspects, such as investment costs or the type of technology used, while the lower-level objectives are likely to quantify the operational performance of the design. In particular, evaluation of the lower-level design objectives requires an optimal control strategy to be available for each design considered, which justifies the bi-level formulation in which the optimal performance of alternative designs cannot be evaluated unless the optimal control strategies are known.

Figure 1. Structure of bi-level design optimization, the lower level. The design is modified based on the performance metrics achieved, i.e. the box below. The variables referred to are from the main case study, see 4.5.


The two optimization problems, the design optimization at the upper level and the operational optimization at the lower level, are combined into a bi-level optimization problem. The upper-level problem is solved for the optimal design structure, while the lower-level problem is a control problem seeking an optimal control to guarantee optimal design performance. The operational optimization problem at the lower level is then parameterized by the values of the design variables, and in the design problem at the upper level optimal control strategies are known for each design as a result of solving the lower-level problem.
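In generic form, the structure just described can be sketched as follows (the notation is illustrative and is not the project's own):

```latex
\begin{aligned}
\min_{d \in D} \quad & \bigl( F_1(d, u^{*}(d)), \dots, F_M(d, u^{*}(d)) \bigr)
  && \text{(upper level: design)} \\
\text{s.t.} \quad & u^{*}(d) \in \arg\min_{u \in U(d)}
  \bigl( f_1(d, u), \dots, f_m(d, u) \bigr)
  && \text{(lower level: operation)}
\end{aligned}
```

Here d collects the design variables (process structure and dimensioning), u the operational controls, F_1, …, F_M the design objectives and f_1, …, f_m the operational objectives; with several objectives at the lower level, u*(d) is in general a set of Pareto optimal policies rather than a single control.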

While bi-level optimization problems have been thoroughly studied in recent decades, their multiobjective counterparts with multiple objectives at one or both levels have, for the most part, been developed only in the last decade. Bi-level multiobjective optimization problems are much more complicated than bi-level single objective problems because the solution set of the lower-level multiobjective problem generally consists of infinitely many elements. Since this solution set is needed to solve the upper-level multiobjective problem, the overall problem is highly complex and computationally very costly. However, an integrated design and operational optimization problem has certain special features that set it apart from a general bi-level multiobjective optimization problem, and therefore it is likely that increased computational efficiency in solving the problem can be achieved with a tailored solution method.

In the design of a production system with a life span of several decades, uncertainty obviously plays a major role due to its presence at each level. At the upper level, uncertainties include parameters of the business environment, such as changes in economic conditions or customer demands, and are taken into account as scenarios. A set of scenarios is representative of the full range of possible events and is handled by means of explicit scenario-dependent design criteria rather than probabilities that might be difficult to quantify. At the lower level, uncertainties are related to the behaviour of production subsystems and components for which probabilities are assumed to be known.

Furthermore, the system under design faces various production tasks that are accounted for in the problem formulation in the same way as scenarios. That is, the upper-level design objectives are evaluated separately for each production task and the designer’s preferences are used in place of relative frequencies of occurrence. A distinction between production tasks and scenarios is, however, necessary because quite often each production task has to be considered in the context of all scenarios.

The multiobjective component of the overall model enables a flexible scenario-based approach to handling uncertainties at the upper level, and also allows the designer to learn about the interrelations and tradeoffs among the optimization objectives during the design process at each level in order to make knowledgeable decisions about the preferred final design.

The research project proceeded along several lines. During the initial stage, the project performed a thorough study of the state of the art in the subject areas related to the project, including bi-level multiobjective optimization, parametric multiobjective optimization, dynamic multiobjective optimization, multi-scenario multiobjective optimization, and multiobjective control. We also gained knowledge about the properties of the integrated design and operational optimization problem associated with optimal design of a paper mill.

In the second stage, we proposed two solution approaches to the overall problem, each addressing different aspects of the problem. We developed a solution approach for a special class of bi-level multiobjective optimization problem which is encountered in production systems design and which addresses the specific needs of the project. We also developed a method to coordinate the solution of a (single level) multiobjective optimization problem involving multiple scenarios, which allows the decision maker to assess the extent to which each objective is impaired in any particular scenario due to the performance requirements of the other scenarios.

In the third stage of our research we started computational work on solving the lower-level operational optimization problem. Being a control problem, the lower-level problem is solved with model predictive control (MPC). Being also a multiobjective problem, it should be solved with special consideration given to the objectives. During the current stage of our work, the multiple objectives have been scalarized into one, and the traditional (single objective) MPC computations have been performed. Once the results of these computations are validated, the next step will involve using a multiobjective MPC approach to assess the tradeoffs between the operational objectives and provide more insight into optimal control at the lower level.

4.2 Work flow of model-based multiobjective design

The essential aspects of the business process of model-based multiobjective process design include the actors, processes, data and supporting IT systems, and the relations between them.

The business process of process design takes place as a networked activity combining the actions of several actors with different roles, organizations and physical locations. Nowadays, the main actors involved in process design are the process designer, the customer and other designers. If an optimizing process design business process is adopted, an additional actor, a process optimizer, will be needed. The process optimizer may be one or more persons who are able to support optimization of process designs. The process optimizer needs to cooperate particularly with the process designer and possibly also with other actors.

The core processes of process design, as it is carried out today, can be broken down into three core activities: design task definition, actual process design and design acceptance. The first activity in the process design work flow is to specify the structural design of the process. The designed aspects of the process include the components of the process and their properties and connections. The design evolves from initial and draft designs to the final design, with the resulting design typically having a P&I diagram level of accuracy. During the design process, designs are created and evaluated. The nature of process structure design is usually iterative to some extent, and the structural design activity is complemented with operational design. A particular feature of process operation design is that it is dependent on structural design and provides feedback for it. The structural and operational design activities are usually concurrent and progressive, proceeding from aggregate designs to detailed designs.

If optimizing process design is adopted, both the new process optimization activity and some developments to the existing process design activities are required. The optimization activity is a consulting type of activity. It starts with requirements and an existing process design, and produces suggestions for how to make the design more optimal. At the same time, the multiobjective nature of the design problem needs to be noted also in the other design activities. The main difference to current work practice is that the design objectives are specified mathematically in order to be suitable for mathematical optimization. During the design activities, the relations between structural and operational designs must be studied carefully, and during the design evaluations, both the structural and operational aspects of the design will need to be combined. Acceptance of the design should, again, be based on the best balance between the design objectives.

The most important sets of data manipulated during process design can be divided into requirements, design, evaluation and resource data. The design data includes two components covering the structure and operation of the process being designed. If optimizing process design is adopted, this needs to be taken into account also in the usage of design data. The multiobjective nature of preference-type requirements must also be noted. The same issue obviously concerns the design evaluations based on these requirements. However, the effect of the new design process on the other process design data sets is less obvious.

The most essential IT systems used during process design include engineering design systems and calculation systems. In addition to these, other types of IT systems are also used, such as document management systems, spreadsheet calculators and word processors. If optimizing process design is adopted, certain modifications to the IT systems used for process design will be needed. The optimization activity will require new calculation tools or, at least, new models to be used with the existing tools. Furthermore, data transfer between the tools used by the process designer and the optimizer will be required.

4.3 Three levels of modelling to support design optimization

Models of the studied process play a central role in the development of the POJo methodology. During the project it has become evident that different types of models are needed for different tasks of the methodology. We have currently identified three levels of model to be applied. These are referred to as prediction, nominal and validation models. All of the models are dynamic, as the goal of the methodology is to optimize process dimensioning and operation.

Figure 2. The three levels of model used in design and operational optimization.


Figure 2 shows how these models fit into the methodology framework.

Figure 2 divides the design – including operational optimization – into two phases: the optimization itself and the validation of the optimal design. In the figure, y is the information that the optimizer receives from a model, i.e. the measurement, and u is the control vector that drives the model.

Operational optimization applies the prediction model. The optimization of the operation of the process uses an MPC-type approach. In this approach, variables are projected into the future at each time instant of the optimization. To do this in practice, the prediction models must be computationally efficient as numerous calls are made. For example, in the main case of the POJo project, the prediction model used was a k-step-ahead linear multiple-input, multiple-output transfer function that was derived from a more detailed model by analytic methods and step response tests. The model predicts how paper quality variables and tower volumes evolve as the control variables are manipulated.

The nominal model, which is used during the design phase, is more detailed than the prediction model and slightly less efficient. The model focuses on the major dynamic features of the process at hand. This allows the model to be simulated with a large time step (e.g. 10 min in the main case). In the POJo main case, the process encompasses the paper machine, stock preparation, broke system and water system in an SC production line. The main flows and the levels of the largest towers are simulated with the nominal model. The optimizer uses the nominal model to evaluate its objective functions using the process measurement variable y. As a result, the optimizer solves the optimal control vector u* that is used to control the nominal model for the next time step. By repeating this procedure as time advances, the control of the process is optimized (Figure 1).
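The receding-horizon loop can be pictured with the following minimal sketch. It is not the project's Matlab/APROS implementation: the linear prediction model, the quadratic objective and all numerical values are stand-ins chosen only to show how the optimizer, the prediction model and the nominal model interact.

```python
# Minimal sketch of the MPC-type loop: optimize a control sequence against a
# linear k-step-ahead prediction model, apply the first move to a nominal
# model, and repeat. Matrices, bounds and targets are illustrative placeholders.
import numpy as np
from scipy.optimize import minimize

A = np.array([[1.0, 0.0], [0.0, 0.95]])     # e.g. tower level, filler content
B = np.array([[-0.5, 0.2], [0.05, -0.1]])   # effect of broke dosage, dilution
x_target = np.array([0.5, 0.22])
HORIZON, N_U = 10, 2

def predict(x0, u_seq):
    """Roll the prediction model forward over the horizon."""
    x, traj = x0, []
    for u in u_seq.reshape(HORIZON, N_U):
        x = A @ x + B @ u
        traj.append(x)
    return np.array(traj)

def objective(u_seq, x0):
    traj = predict(x0, u_seq)
    return np.sum((traj - x_target) ** 2) + 1e-3 * np.sum(u_seq ** 2)

def mpc_step(x0):
    """Solve for the whole control sequence, return only its first move."""
    res = minimize(objective, np.zeros(HORIZON * N_U), args=(x0,),
                   bounds=[(-1.0, 1.0)] * (HORIZON * N_U))
    return res.x[:N_U]

# Closed loop against a "nominal model" (here the same dynamics plus noise)
x = np.array([0.8, 0.20])
for t in range(50):
    u_star = mpc_step(x)                                  # optimal first move u*
    x = A @ x + B @ u_star + 0.01 * np.random.randn(2)    # advance nominal model
```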

To address the validity of the solution a specific validation model is used, as illustrated on the right-hand side of Figure 2. The validation model is used to compare how the selected optimization problem formulation and solution perform in a more realistic setting. The prediction model remains the same, but the nominal model is replaced by the validation model. This model describes the same process area as the nominal model, but is more detailed and thus more realistic. For example, in the POJo main case the validation model’s additional details were implemented in such a way that they constitute a super-structure, with each additional detail having a parameter indicating whether it is taken into account or not. In the first modification, the filler retention at the PM is allowed to vary; the filler retention variation was modelled simply as additive, filtered random noise. The second modification allowed quality variation in the flow line from the TMP mill, with random two-state switching between normal and poor quality TMP. The third modification added a new tank, a white water tank, to the water system in addition to the white water tower.

To conduct a validation of optimal design and operation, a set of comparison variables that describes the key features of the process must first be defined. The comparison proceeds as follows:
1. Run simulation experiments to obtain optimal time series of the comparison variables (denoted by x_i,j,nom(t), i = 1, …, N_comparison variables and j = 1, …, N_replications) with the nominal model and the selected optimization method.
2. Run simulation experiments to obtain optimal time series of the comparison variables (x_i,j,val(t), i = 1, …, N_comparison variables and j = 1, …, N_replications) with the validation model and the selected optimization method.
3. Perform a qualitative analysis (i.e. visual evaluation of the data) for all time series pairs (variable x_i,j,nom(t), variable x_i,j,val(t)).
4. Perform time-mean value distribution testing.
5. Perform coefficient of variation distribution testing.
6. Summarise the results of the comparisons.

In steps 1 and 2, several simulation replications with both models are realized. This is done because the process contains stochastic elements (web breaks) and thus a single simulation is not representative. This is also reflected in steps 4 and 5, where statistical comparisons are made. The purpose of step 4 is to see whether the average behaviour of the systems is similar, and step 5 assesses the similarity of the systems in terms of their variation. Step 6 involves producing an analysis of the previous steps, including evaluation of a set of indices quantifying the similarities of the systems.
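Steps 4 and 5 could be implemented, for example, along the following lines; the two-sample Kolmogorov–Smirnov test and the synthetic data are illustrative choices, not necessarily the tests used in the project.

```python
# Sketch of comparing time-mean and coefficient-of-variation distributions
# between nominal-model and validation-model replications.
import numpy as np
from scipy.stats import ks_2samp

def summarize(replications):
    """replications: N_rep x N_time array for one comparison variable."""
    means = replications.mean(axis=1)
    cvs = replications.std(axis=1) / np.abs(means)
    return means, cvs

def compare_models(nominal, validation, alpha=0.05):
    m_nom, cv_nom = summarize(nominal)
    m_val, cv_val = summarize(validation)
    mean_test = ks_2samp(m_nom, m_val)     # step 4: time-mean distributions
    cv_test = ks_2samp(cv_nom, cv_val)     # step 5: variation distributions
    return {
        "means_similar": mean_test.pvalue > alpha,
        "variation_similar": cv_test.pvalue > alpha,
    }

# Example with synthetic replications (20 runs x 1000 time steps)
nom = 100 + np.random.randn(20, 1000)
val = 100 + 1.1 * np.random.randn(20, 1000)
print(compare_models(nom, val))
```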

4.4 Implementing the approach with pre-existing software

The simulation-based design and operational optimization has been implemented using pre-existing software packages, namely Matlab and APROS®. The main software-related challenges include the fact that Matlab is not well suited to implementing the detailed process models required in the validation stage, whereas implementing advanced dynamic optimization within APROS is rather difficult without communicating with external software such as Matlab. The required communication adds to the computation time, which in turn slows the design optimization. For this reason, in the typical setup, the optimization algorithm, the nominal model and the prediction model were run in Matlab only, to achieve a reasonable design optimization computation time, whereas the validation model was simulated with APROS, either standalone or communicating with Matlab for operational optimization. As the number of validation simulations is much smaller than in design optimization, the validation response time is acceptable with this arrangement.

The connection between APROS and Matlab was realized in two different ways. The first and simplest way involved transferring data between the two software packages using ASCII text files. Matlab wrote the control actions into a text file which APROS read and then simulated over a predefined time period. During this period APROS logged the selected variables in another text file that was read by Matlab as the input for the optimization. Although far from efficient, the approach proved to function reliably. Later in the project a more advanced means of connecting Matlab and APROS was implemented using the OPC communication protocol. In this setup, an external OPC client application was installed in Matlab. This client was used to connect Matlab to the APROS OPC Server. All data and commands were transferred through the OPC channel and no text files were used. The efficiency of this approach was evident.

In both setups Matlab acted as a “master” controlling the flow of information. APROS received new control actions from Matlab together with the command to simulate. After this, Matlab requested the values of the selected variables from APROS and used them in the optimization.
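Seen from the optimizer's (“master”) side, the text-file handshake amounts to something like the following sketch. The file names, formats and polling scheme are hypothetical, and the project used Matlab and APROS rather than this Python stand-in.

```python
# Hypothetical file-based handshake between an optimizer and a simulator.
import os
import time
import numpy as np

CONTROL_FILE = "controls.txt"   # written by the optimizer, read by the simulator
RESULT_FILE = "results.txt"     # written by the simulator, read by the optimizer

def run_simulation_step(u, timeout=600.0):
    """Send one control vector and wait for the simulator's logged variables."""
    if os.path.exists(RESULT_FILE):
        os.remove(RESULT_FILE)            # make sure stale results are not read
    np.savetxt(CONTROL_FILE, u)           # hand the new control actions over
    t0 = time.time()
    while time.time() - t0 < timeout:
        if os.path.exists(RESULT_FILE):
            try:
                return np.loadtxt(RESULT_FILE)
            except ValueError:
                pass                      # file is still being written
        time.sleep(1.0)
    raise TimeoutError("simulator did not respond")
```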

4.5 Main case: Design optimization applied to paper machine furnish management

4.5.1 Definition of the main case and its variations

The design task is to specify the volumes of the broke, 0-water, clean water, TMP and chemical pulp towers (and optionally the storage volume of dry broke) and the corresponding operational policy when a web break statistic is given. The goals are to minimize investment costs (tower volumes), long-term averages of quality fluctuations (filler content and basis weight), time spent in breaks, and the probability of tower overflows. Three versions of the design were considered: design of broke tower volume (case 0), design of broke tower, 0-water tower and clean water tower volumes (case A) and the overall case (case B). The operational policy consists of defining the operation of the following variables (letters in parentheses indicate the cases in which the variable was considered):
• broke dosage (0, A, B)
• disc filter feed flow (A, B)
• rate of pulping of dry broke (A, B)
• fresh water intake (A, B)
• recirculation of disc filter output back to 0-water tower (A, B)
• TMP-to-chemical pulp ratio (B)
• rate of TMP production (B)
• rate of chemical pulp production (pulping of bales) (B)
• TMP tower inflow consistency (B)
• chemical pulp tower inflow consistency (B)

In the following, the operational optimization of case A will be presented and the design optimization of case 0 will be presented with comments on case A. Case B is a relatively straightforward extension of case A, as the optimization of the virgin towers is coupled to the optimization in case A only through the usage of dilution water. See also 4.6.1 for TMP tower optimization and 4.6.2 for chemical pulp tower optimization.

Figure 3. The main case: furnish and water management system of an SC paper production system.

Web breaks cause large fluctuations in broke tower and 0-water volumes. Therefore modelling of breaks is crucial for the success of the design. In all cases the breaks were considered stochastic. In case 0 the onset of a break was directly made a nonlinear function of broke dosage, mimicking the fact that “vicious circles” of breaks and the need to increase broke dosage are often experienced at mills. In cases A and B a “strength” model was implemented and the break probability was made a nonlinear function of “strength”. The robustness of the optimal design with respect to the break model was studied in case 0 (4.5.3).

The nominal model implemented in Matlab was a simple dynamic mass transfer model evaluated at 10 minute time steps.
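The flavour of such a nominal model is sketched below for a single broke tower; the flow rates, tower size and break pattern are invented for illustration and do not correspond to the model used in the project.

```python
# Toy dynamic mass-balance model of one broke tower at 10-minute time steps.
import numpy as np

DT_H = 10.0 / 60.0                 # time step: 10 minutes, in hours

def broke_tower_step(volume, broke_dosage, web_break, v_max=3500.0):
    """Advance the broke tower volume (m3) by one time step."""
    inflow = 300.0 if web_break else 40.0      # m3/h: whole web to broke on a break
    outflow = broke_dosage * 250.0             # m3/h: broke dosed back to the furnish
    volume = volume + DT_H * (inflow - outflow)
    return float(np.clip(volume, 0.0, v_max))  # physical limits of the tower

# Example: simulate 24 h with a web break between hours 4 and 5
volume, trace = 1000.0, []
for step in range(int(24 / DT_H)):
    t_h = step * DT_H
    web_break = 4.0 <= t_h < 5.0
    volume = broke_tower_step(volume, broke_dosage=0.3, web_break=web_break)
    trace.append(volume)
```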

4.5.2 Operational optimization

Figure 4 shows a result of operational optimization: a nominal model has been run with a fixed design with operational optimization on.

Figure 4. Case A with operational optimization. Time step 10 min, horizontal axis: number of time steps. Two leftmost columns, three rows from top: six volumes (broke, 0-water and clean water towers limited to 3500 m3). Two leftmost columns, two rows from bottom: paper composition (breaks seen as intervals with zero values). Third column: manipulated variables. Fourth column: deviation of quality from target (filler target: 0.22, basis weight 6 units, corresponding to a basis weight of 60 g/m2, strength 0.95 units).

Operational optimization is a constrained MPC-type problem looking 50 time steps ahead. The goal of the operational optimization is to keep the quality parameters (filler, basis weight and strength) at their target values while preventing the broke tower, 0-water tower and clean water tower from overflowing or running empty. These objectives form a multiobjective problem that has been scalarized by defining an objective function that is a weighted sum of squared deviations from quality set points and by constraining the probabilities of overflows and of running empty, respectively. The scalarization parameters are the degrees of freedom of the design together with the process structure (tower volumes).
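One possible way to write down the scalarization just described (the notation is illustrative; the weights and probability limits are exactly the scalarization parameters left as design degrees of freedom):

```latex
\min_{u_t,\dots,u_{t+H}} \;
\sum_{k=t}^{t+H}
  \Bigl[ w_{\mathrm{fil}} \bigl(c_k - c_{\mathrm{ref}}\bigr)^2
       + w_{\mathrm{bw}} \bigl(b_k - b_{\mathrm{ref}}\bigr)^2
       + w_{\mathrm{str}} \bigl(s_k - s_{\mathrm{ref}}\bigr)^2 \Bigr]
\quad \text{s.t.} \quad
\Pr\bigl(V^{(i)}_k > V^{(i)}_{\max}\bigr) \le \alpha_i, \;\;
\Pr\bigl(V^{(i)}_k < V^{(i)}_{\min}\bigr) \le \alpha_i ,
```

where c_k, b_k and s_k are the predicted filler content, basis weight and strength over the 50-step horizon H, and V^(i)_k are the broke, 0-water and clean water tower volumes.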

4.5.3 Design optimization

The design optimization considers long-term averages of performance indices in simulations such as that depicted in Figure 4, as a function of the design degrees of freedom. Let us consider case 0, in which we have only one design degree of freedom related to the process equipment, i.e. the broke tower volume. The design seeks to:
• minimize investment cost (tower volume)
• maximize the time till tower overflow (running the broke tower empty can always be prevented by setting broke dosage to zero)
• minimize long-term average quality variations (filler variations)
• minimize time spent in breaks

The degrees of freedom are the broke tower volume and the operational policy. The operational policy addresses the probability of overflow, filler variations and the risk of breaks during the time horizon of operational optimization and is thus multiobjective. The policy has been scalarized with one parameter related to the relative importance of breaks and filler variations and with a parameter related to overflow probability. Thus, the design must find values for these parameters.

The design optimization was done rather by brute force. Figure 5 presents the combinations of investment cost and overflow frequency, and of filler variation and proportion of time spent in breaks.
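The screening step can be expressed compactly: evaluate each candidate design on the long-run performance indices and keep only the non-dominated (Pareto optimal) ones. The sketch below is a generic illustration with random placeholder data, not the project's code.

```python
# Brute-force design screening: keep the Pareto optimal candidates.
import numpy as np

def pareto_front(points):
    """points: N x M array, all objectives to be minimized.
    Returns a boolean mask of non-dominated rows."""
    n = len(points)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # A row j dominates row i if it is no worse everywhere and better somewhere.
        dominates_i = (np.all(points <= points[i], axis=1) &
                       np.any(points < points[i], axis=1))
        keep[i] = not dominates_i.any()
    return keep

# Example: columns = investment cost, 1/time-to-overflow, filler variation, time in breaks
designs = np.random.rand(200, 4)
front = designs[pareto_front(designs)]
print(f"{len(front)} Pareto optimal designs out of {len(designs)}")
```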

As dealing with numerous objectives is laborious for the designer, the following approach has been chosen. Firstly, a design candidate was chosen from among the Pareto optimal solutions by considering only the first two objectives.

Figure 5. Performance of design options in case 0. Top: investment cost and inverse of time till overflow. Bottom: filler variation and proportion of time spent in breaks (note: for illustration purposes, an exceptionally high break tendency has been used). The point circled is Pareto optimal and thus the potential design.


Its performance with respect to the other objectives was then assessed (point circled in Figure 5). To improve the design with respect to the latter two objectives, designs around the chosen point have been further refined until a good compromise has been achieved (Figure 6). This compromise is characterized by a combination of broke tower volume and operational policy, which are then implemented as the result of the design.

The performance characteristics depend on the chosen break model. Clearly, such break models cannot be very accurate. The robustness of the design with respect to the break model must therefore be assessed. Table 1 shows the results of such an assessment: three break models have been compared so that the design has assumed one model and the performance has been evaluated with another model. According to these results the design is quite robust with respect to overestimation of the break tendency but somewhat sensitive to underestimation.

The design scalarizes the operational optimization and can thus be criticized for not leaving operational degrees of freedom once the design has been implemented. In this sense, the multiobjective nature at the inner level is lost. One potential way of dealing with this problem is to design a set of operational policies, each Pareto optimal, and allow the process operators to choose from amongst them. However, this is an outstanding issue in the methodology development rather than an issue for the main case.

Figure 6. Left: further refinement of designs around the point chosen in Figure 5. Right: close-up of the left-hand figure showing the Pareto optimal designs in the break/filler plane. The final design is chosen from amongst these.

Case A can be solved in a similar way to that presented above for case 0. For each design, including the three tower volumes and the scalarization parameters of operational optimization, a few tens of simulations, such as the one in Figure 4, are to be carried out in order to assess the performance of a design with respect to tower volume management, time spent in breaks and quality variations. This data combined with investment cost data then leads to a similar analysis to that presented in Figures 5 and 6. As the designer chooses between performance alternatives rather than design degree of freedom alternatives, the complexity of the design is not increased radically although the process scope is much wider. However, producing the performance data will be much more time consuming and, thus, the brute force solution applied in case 0 must be refined.

The detailed POJo report will present the preliminary results of the case A design optimization. Analysis of the main case, versions A and B, will continue under work package 9 of the EffNet programme as a test bench for methodology development.

4.6 Further case studies

4.6.1 TMP plant design

At a highly conceptual level, the TMP design problem combines the selection of the number of refiners and the size of the storage volume between the TMP plant and the paper machine. Electricity is an important operational cost factor in TMP production. The electricity pricing principles for large TMP producers in Finland can be fairly complicated, as it is in the interest of power producers for paper producers to maintain steady consumption.

Break probability (Sr/So)   Overflow time   Filler variation   Time in breaks
High/High                        491.64          0.1335             0.2981
High/Med                         467.78          0.1547             0.2931
High/Low                         427.94          0.1800             0.2866
Med/High                         895.44          0.1175             0.2323
Med/Med                          900.94          0.1359             0.2250
Med/Low                          896.40          0.1518             0.2199
Low/High                        3327.9           0.0737             0.1056
Low/Med                         3225.1           0.0840             0.0997
Low/Low                         2650.2           0.0962             0.0989

Table 1. Robustness analysis of a selected design for three break probability functions. The first model is the model with which the performance has been assessed, and the second model is the model assumed in the design.


However, in this conceptual design case it is assumed that the energy cost of TMP production at a given moment is independent of consumption at other time instances, and that the instantaneous cost varies over time. Furthermore, it is assumed that both the price of electricity and the demand for TMP are known without uncertainty 48 hours ahead.

TMP should, naturally, be produced and its storage increased when electricity is at its cheapest and, correspondingly, storage should be decreased and production reduced or terminated when electricity prices peak. This combinatorial optimization problem can be readily solved, for example by applying the simulated annealing algorithm.
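As an illustration of that idea, the sketch below schedules an hourly number of running refiners over 48 hours with simulated annealing, trading electricity cost against a storage constraint. Prices, demand, refiner data and the annealing parameters are all invented for the example.

```python
# Simulated annealing sketch for scheduling TMP production against electricity price.
import numpy as np

rng = np.random.default_rng(0)
HOURS = 48
price = 40 + 20 * np.sin(np.arange(HOURS) / 24 * 2 * np.pi) + rng.normal(0, 3, HOURS)  # EUR/MWh
demand = np.full(HOURS, 60.0)              # t/h of TMP needed by the paper machine
RATE, POWER = 25.0, 20.0                   # t/h and MW per running refiner
N_REFINERS, STORAGE_MAX, STORAGE_0 = 4, 800.0, 400.0

def cost(schedule):
    """schedule: number of running refiners per hour. Energy cost plus a large
    penalty whenever the storage constraint is violated."""
    production = schedule * RATE
    storage = STORAGE_0 + np.cumsum(production - demand)
    penalty = 1e6 * np.sum((storage < 0) | (storage > STORAGE_MAX))
    return float(np.sum(schedule * POWER * price)) + penalty

def simulated_annealing(n_iter=20000, t0=500.0):
    schedule = rng.integers(0, N_REFINERS + 1, HOURS)
    best, best_cost = schedule.copy(), cost(schedule)
    current_cost = best_cost
    for k in range(n_iter):
        temp = t0 * (1 - k / n_iter) + 1e-3
        candidate = schedule.copy()
        candidate[rng.integers(HOURS)] = rng.integers(0, N_REFINERS + 1)  # perturb one hour
        c = cost(candidate)
        if c < current_cost or rng.random() < np.exp((current_cost - c) / temp):
            schedule, current_cost = candidate, c
            if c < best_cost:
                best, best_cost = candidate, c
    return best, best_cost

schedule, total_cost = simulated_annealing()
print("energy cost of best schedule:", total_cost)
```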

The higher the number of refiners and the higher the volume of storage, the more degrees of freedom there are to schedule TMP production. However, these degrees of freedom entail an investment cost. Figure 7 shows the combinations of production and investment costs for a number of designs. For each design, operation has been optimized for given short-term price/demand variations. The final selection in conceptual design can be considered as a multiobjective optimization problem between operational and investment costs in a set of price/demand variations, or the objectives can be accumulated as an average net present value to be optimized.

4.6.2 Aspects of buffer volume design

Towers are used as buffers: if equipment upstream of a buffer goes off stream, production can be continued with the material in the buffer. For example, in the main case (4.5) the chemical pulp tower can be seen as a buffer against pulper failure.

The optimal dimension of the buffer is determined by the failure statistics, the value of lost production time and the investment cost. The failure statistics are determined by the mean time between failures and the probability density distribution of the duration of failure.

Figure 8 presents the size of the optimal buffer volume as a function of the expected time to recovery for a case in which failures are rare enough for the risk of double failures to be disregarded. The time between failures is exponentially distributed and the duration of failure is Gaussian distributed. It is noteworthy that when the recovery from failure is fast, there is a discontinuous jump from having a finite size buffer volume to having no buffer at all. This results from the fact that the buffering capacity is proportional to volume, whereas the cost of volume typically scales as V^a, with a roughly 0.6–0.8. Where this discontinuity occurs is a somewhat complex function of the mean time between failures, the investment cost coefficient and the value of lost production time.
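The trade-off can be reproduced numerically with a few lines; all parameter values below are invented, and the point is only to show the structure of the cost being minimized (expected lost-production cost plus an investment cost scaling as V^a).

```python
# Numerical sketch of the buffer-sizing trade-off with illustrative parameters.
import numpy as np

MTBF_H = 500.0               # mean time between failures, hours
MEAN_REPAIR_H = 4.0          # expected time to recovery, hours
SD_REPAIR_H = 1.5            # its standard deviation (Gaussian duration)
FEED_RATE = 100.0            # m3/h drawn from the buffer while upstream is down
LOST_VALUE = 5000.0          # EUR per hour of lost production
INVEST_COEF, A = 200.0, 0.7  # investment cost = INVEST_COEF * V**A
HORIZON_H = 8760.0           # costs evaluated over one year

rng = np.random.default_rng(1)
durations = rng.normal(MEAN_REPAIR_H, SD_REPAIR_H, 100_000).clip(min=0.0)

def expected_total_cost(volume):
    covered_h = volume / FEED_RATE                          # hours the buffer can bridge
    lost_h = np.clip(durations - covered_h, 0.0, None).mean()
    n_failures = HORIZON_H / MTBF_H
    return n_failures * lost_h * LOST_VALUE + INVEST_COEF * volume ** A

volumes = np.linspace(0.0, 1500.0, 151)
best = volumes[int(np.argmin([expected_total_cost(v) for v in volumes]))]
print("optimal buffer volume:", best, "m3")
```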

Figure 7. Combinations of investment and operational cost achievable for a set of designs (number of refiners, storage volume). For each design, the operation is optimized for a given variation of electricity price and TMP demand. The optimal design is chosen from among the Pareto optimal points.


4.6.3 Operational optimization case studies

At the operational level, the bi-level multiobjective optimization problem contains an optimizer and a mathematical model. In this study, different solution approaches have been utilized. When results are required in real time, an efficient solver as well as a simplified model is needed. On the other hand, if a longer computational time can be accepted, a more complex mathematical model can be presented using a dynamic simulator, for example. The basic idea of the solution process is that for a given design (and design level) the corresponding operational tasks are solved using multiobjective optimization. Because the mathematical models are usually dynamic, this has to be taken into account in the optimization procedure as well. In the case studies of the POJo project, dynamics are handled using the receding horizon prediction principle (model predictive control).

Depending on the solution approach, different optimization methods and modelling tools can be used. The main case describes the dynamics of the clean water tower, 0-water tower, wet broke tower, and dry broke storage (see 4.5). In this case study, the modelling is carried out with the aim of obtaining an efficient solution process. The computational time is therefore minimised to only a couple of minutes. The aim is to optimize broke system management, and the optimization problem has been defined in three different ways: as a stochastic optimization problem, a linear-quadratic-Gaussian problem and a deterministic problem.

The operational optimization problems can also be solved using a dynamic process simulator. Even though the total computational time is shorter compared to the entire bi-level problem, computation for this type of study still takes hours. Two case studies have been conducted using the APROS dynamic process simulator and a differential evolution algorithm. The studies cover a retention disturbance and a grade change, respectively. In both cases, multiobjective optimization was exploited to improve the performance of PI controllers. The results obtained show that this approach was able to produce smaller disturbances and thereby better stability of the paper quality parameters compared to reference simulations, as shown in Figure 9.
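The PI tuning idea can be illustrated with SciPy's differential evolution on a toy first-order process; the process model, disturbance and objective below are stand-ins for the dynamic simulator and the multiobjective formulation used in the case studies.

```python
# Differential evolution tuning of PI gains against a simulated disturbance response.
import numpy as np
from scipy.optimize import differential_evolution

DT, STEPS = 1.0, 300          # s
TAU, GAIN = 30.0, 2.0         # toy first-order process parameters

def disturbance_response(params):
    """Integral of squared error for a step disturbance, given PI gains."""
    kp, ki = params
    y, integral, cost = 0.0, 0.0, 0.0
    for k in range(STEPS):
        disturbance = 1.0 if k > 10 else 0.0
        error = 0.0 - y                       # setpoint is zero
        integral += error * DT
        u = kp * error + ki * integral        # PI control law
        y += DT / TAU * (-y + GAIN * u + disturbance)
        cost += error ** 2 * DT
    return cost

result = differential_evolution(disturbance_response,
                                bounds=[(0.0, 10.0), (0.0, 1.0)])
print("tuned gains (kp, ki):", result.x)
```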

Based on the promising results of the previous case studies, the next step is to broaden the research interest towards handling bi-level optimization problems using a similar approach. This would enable more thorough studies of the design, operation, economy and profitability of the paper mill. The most recent case study includes the design and operational levels as well as the use of a dynamic process simulator. Both levels contain a multiobjective optimization problem and each solution at the design level requires a multiobjective optimization process at the operational level. The solution process is, consequently, time-consuming, with a computational time measured in days. The case study presents the optimization of broke towers using optimal broke handling.

Figure 8. Optimal buffer size as a function of expected time to recovery (arbitrary units). The different curves represent varying uncertainty regarding the recovery time (red curve: deterministic recovery time; other curves: standard deviation proportional to expected time). The stepwise behaviour derives from the points at which the optimal buffer size has been evaluated.

5. Future plans and key development needs

The project will continue as work package 9 of the EffNet programme. The work is divided into three parts: methodological development, solving the most relevant application cases as specified by EffNet work package 5, and further analyzing the changes in design work flow required by the POJo methodology.

The main methodological issues to be tackled are:
• How to retain operational degrees of freedom in design? The current approach designs the operational optimization by setting universal scalarization parameters for the initially multiobjective operational task. The opportunity of designing a Pareto optimal region of scalarization parameters and then allowing the operating personnel to choose among these is one avenue to explore.
• How to efficiently solve the problems related to increasing complexity? As discussed above with respect to cases 0 to A of the main case, the main computational issue is the efficient generation of Pareto optimal solutions. The brute force approach used in case 0 can be extended to case A with marginally acceptable computation times, but increasing the complexity further calls for a more structured approach.

The cases to be solved under the EffNet programme will no longer be chosen from the point of view of methodology development, but will be based on industrial relevance and the production concept development of the entire EffNet programme.

The goal of the work flow analysis is to draft a business plan for the provision of a design service based on the POJo methodology. The organization of such a service amongst the owners of Forestcluster Ltd and consultancy companies, and the need for setting up start-up companies, will be tackled.

Figure 9. Results from the retention disturbance and grade change case studies.

The POJo methodology is expected to provide short-term benefits, in particular in the area of operational optimization. For example, the operation of material flows, studied in conjunction with the main case, is a task that is today carried out by process operators without the assistance of computational tools. Although operators are highly skilled in these tasks, in complex operational situations and in the face of dwindling manpower at mills, operational optimization, such as that presented in 4.5.2, is expected to offer high potential. The implementation of such tools will be predominantly carried out by companies subcontracted by the POJo partners.

6. Exploitation plan and impact of results

The project addresses the key economic challenge of the chemical forest industry – capital intensiveness – and the results of the project are, as such, essential with respect to the renewal of the industry. Furthermore, the design methodology is highly generic and can be applied not only to the design of papermaking systems but also to that of biointegrates. The flexibility and capital intensiveness of biointegrates is expected to become a major issue as investment in this business area increases. The opportunities provided by POJo will play a central role in developing novel paper production system concepts in the EffNet programme, and may also be drawn on in the conceptual design of pulp mills in the EffFibre programme.

Uptake of the project results requires active industrial participation and the generation of industry-led application projects. The operational optimization results of the project, in particular, can also be readily applied to existing production systems. One such industry-led project, PäTeVä, has already been carried out (11/08–6/10). In addition, an industry-led project closely linked to POJo was conducted by one of the owners of Forestcluster Ltd, resulting, for example, in four MSc theses being carried out at Tampere University of Technology (TUT) and totalling one man-year of MSc researcher work.

Under the EffNet programme, the integration of POJo ideas with the QVision project’s groundbreaking work in the area of future unit processes looks extremely promising. POJo will provide tools for understanding and optimizing the mill-wide benefits of new unit process technologies.

The main approach of the project, replacing process equipment with information-based tools, will also change ways of working in the industry. The transition towards more knowledge-intensive work tasks in such production systems will enhance the industry’s attractiveness and its ability to employ a more highly skilled workforce.


7. Publications and reports

Eskelinen, P., Miettinen, K., “Trade-off analysis approach for interactive nonlinear multiobjective optimization”, submitted to OR Spectrum.

Eskelinen, P., Ruuska, S., Miettinen, K., Wiecek, M., Mustajoki, J., “A Scenario-Based Interactive Multiobjective Optimization Method for Decision Making under Uncertainty”, Proceedings of the 25th Mini EURO Conference on Uncertainty and Robustness in Planning and Decision Making (URPDM2010), C.H. Antunes, D.R. Insua, L.C. Dias (Eds.), Coimbra, April 2010.

Linnala, M., “Optimization and simulation tools in paper machine concept design”, in Yearbook 2010 of the International Doctoral Programme in Pulp and Paper Science and Technology (PaPSaT), Järvelä, H. (Ed.), Espoo, Finland, in press.

Linnala, M., Madetoja, E., Ruotsalainen, H., Hämäläinen, J., “Operational optimization as a part of a bi-level optimization problem”, abstract accepted to the 24th European Conference on Operational Research (EURO XXIV), July 11–14, 2010, Lisbon, Portugal.

Linnala, M., Ruotsalainen, H., Madetoja, E., Savolainen, J., Hämäläinen, J., “Dynamic simulation and optimization of an SC papermaking line – illustrated with case studies”, Nordic Pulp & Paper Research Journal, 25(2), 2010.

Lintunen, J., Optimal Operation of a Broke System Supported with Integrated Process and Control Design, MSc thesis, TUT, 2009.

Ropponen, A., Ritala, R., Pistikopoulos, E. N., “Broke management optimization in design of paper production systems”, submitted to ESCAPE-20, June 6–9, 2010, Ischia, Italy.

Ropponen, A., Ritala, R., Pistikopoulos, E. N., “Optimization issues of the broke management system – the value of the filler content measurement”, Control Systems 2010, September 2010, Stockholm.

Ropponen, A., Ritala, R., Pistikopoulos, E. N., “Optimization issues of the broke management system in papermaking”, submitted to Computers and Chemical Engineering.

Ruuska, S., Miettinen, K., Wiecek, M.M., “Solution Concepts for Some Bi-level Multiobjective Optimization Problems”, 9th International Conference on Multiple Objective Programming and Goal Programming (MOPGP’10), May 24–26, 2010, Sousse, Tunisia.

Sindhya, K., Haanpää, T., Ruuska, S., Miettinen, K., “New Mutation Approach for Multiobjective Optimization with Differential Evolution”, submitted to IEEE Transactions on Evolutionary Computation.

Tuomi, A., Application Integration for Condition-Based Maintenance, MSc thesis, Aalto University School of Science and Technology, Faculty of Electronics, Communications and Automation, 2010, 51 p.

Virta, J., Application Integration for Production Operations Management Using OPC Unified Architecture, MSc thesis, Aalto University School of Science and Technology, Faculty of Electronics, Communications and Automation, 2010, 59 p.

Virta, J., Seilonen, I., Tuomi, A., Koskinen, K., “SOA-Based Integration for Batch Process Management with OPC UA and ISA-88/95”, 15th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA 2010), September 13–16, 2010, Bilbao, Spain.


Image-based measurement methods for quality in pulping and papermaking (QVision)

Project Manager: Heikki Kälviäinen, [email protected]

Duration of the project: 1.1.2009–30.6.2010

Project budget: EUR 1,155,000

Project partners and their roles:

University of Helsinki, Inverse Problems Research Group – Inversion mathematics, prior information.

University of Eastern Finland, Color Research Laboratory – Surface chemistry application topics.

University of Jyväskylä, Department of Mathematical Information Technology – Information processing hardware and software to produce image maps; 2-D variability analyser platform specification, prototyping and feasibility analysis.

University of Jyväskylä, Department of Physics – Production of prior information on paper structure based on x-ray tomography, and development of image analysis methods for characterization of paper structure from 3-D tomographic reconstructions.

Lappeenranta University of Technology, FiberLaboratory – Expertise in fibre technology, processing equipment, imaging equipment.

Lappeenranta University of Technology, Department of Information Technology – Imaging and characterization of paper and print with laboratory measurements; image processing and analysis methods for process measurements.

Numerola Oy – Development of optimization algorithms, implementation of (pre-existing) quality models.

University of Oulu, Control Engineering Laboratory – Process control and diagnostics.

University of Oulu, Measurement and Sensor Laboratory – Testing a polarization microscopic method for measuring dirt particles and fibre deformations from pulp.

University of Oulu, Opto-electronics and Measurement Techniques Laboratory – Optical coherence tomography of paper.

Tampere University of Technology, Institute of Measurement and Information Technology – Measurement and control of MD and CD variations; transmittance as an indirect measure of basis weight in control; 2-D adaptive estimation; imaging measurements of paper structure in laboratory conditions; roadmap analysis.


Abstract

The QVision project investigates image-based quality measurement methods for pulping and papermaking. Digital imaging has long been applied in the paper industry for defect detection, but only recently have bit and position resolutions and the storage of images been technically developed to enable 10 m wide web imaging and the storage of images at a spatial resolution of 0.1 mm over hundreds of meters of web running at 30 m/s. However, as yet, such image data has not been used as a means of managing quality or quality uniformity. The QVision project has proposed and developed image-based measurement methods capable of developing the functional properties of fibre-based products and also controlling them during production at a radically improved degree of specificity and scope. The methods are a necessity in the development and production of new high added-value products. The research focused on the development of a 2-D web-wide variability analyser, high-resolution and rough-resolution structural scanning measurements, dirt (and other) particle analysis, 2-D control methods and surface chemistry.

Keywords: papermaking, pulping, paper web, image processing and analysis, machine vision, control, overall quality


1. Project background

This report presents the content and results of the QVision project, which operates under the EffTech programme led by Forestcluster Ltd. The aim of the project is to investigate image-based quality measurement methods for pulping and papermaking. The project comprises 11 research partners and involves several industrial partners from Forestcluster.

Wood is among the few Finnish raw materials with reserves large enough for large-scale production. Despite this wealth and a diversity of end products, the forest industry is, for a number of reasons, in a state of transition. The capability of the Finnish chemical industry to produce fibre-based products at a large scale is an important advantage. Also, the necessary production plants are available and functional, and the industry has the knowledge to use these resources to convert and refine the raw material into fibre-based products and raw material for other processes. To adapt to the current challenges related to raw material, energy, the environment, and globalization, the forest industry has to reorganize. One of the central goals of the industry's regeneration is to increase the proportion of high-value-added products.

From the business viewpoint, products that are demanding to produce will play a key role in this regeneration. In this scenario, it is reasonable to assume that the requirements for these products and their functionality will be considerably higher than those applied today. This implies that the measurement technology currently used as the basis of quality management poses a potential major limitation. Current quality control is based on a combination of scanning on-line measurements and automated and manual laboratory measurement of samples. These existing practices do not give an adequate picture of a product's quality at the resolution that is used in the actual final assessment of the product, such as during visual assessment by the end user. Current quality management will thus prove inadequate in most cases in the future, making it a limiting factor in the regeneration process.

Digital imaging and illumination technology have been two of the most rapidly developing sectors of electronics during the last decade. Digital photography and the emergence of mobile phone and digital camera hybrids have been the driving forces within the consumer market, and scientific imaging, for example in the health sector, is always hungry for more sensitive technology. Technological advancements in imaging and illumination have also enabled more profound laboratory research, development, and quality control of products, as well as industrial applications in the control, diagnostics, and quality management of manufacturing processes. Machine vision has been applied in the paper industry for a long time, but imaging and image processing technology has only recently developed to enable image data capture at a 0.1 mm resolution across an entire paper web moving at 30 m/s. When calibrated and controlled with appropriate laboratory measurements, such information enables a radically new view of the product in terms of both representation and coverage, and can also be used for diagnosis and control of the product itself.

The development of image-based on-line solutions is the ultimate goal of the QVision project. To reach this goal, developments in off-line and laboratory-level measurement are also investigated, including new types of enhanced imaging such as nanotomography. Such measurement techniques enable a better understanding of the structure of paper and related phenomena.


2. Project objectives

This chapter considers the objectives and restrictions of the project. The content of the project is described, including the relevant research tasks and development steps during the research.

QVision focuses on image-based measurement and characterization methods related to quality in pulping and papermaking. The project develops methods enabling the development of functional properties of fibre-based products, and also the management of relevant properties in production, at a radically improved degree of specificity and scope. The project develops entirely new measurements characterizing structures and quality, and a procedure for transferring image-based research and off-line methods to on-line use and quality management.

The technical objectives of QVision are as follows:
• Solutions for imaging measurement of paper web and processes.
• Methods for image registration and image combinations.
• Characterization of paper properties and other properties related to quality.
• Tools for combining image-based measurements and modelling for innovation of paper structure.
• Framework for overall management of quality.

The scientific objectives of QVision are as follows:
• To apply modern digital imaging and illumination technology.
• To analyze large data sets in real time.
• To understand image-based characterization.
• To apply inversion methods.
• To combine the quality control concept with data analysis and control methods.

To attain these objectives, the content of the project is organized as shown in Figure 1; the research has been divided into six research tasks.

 

[Figure 1 (diagram): the six research tasks of QVision (T1 Overall quality management, T2 Combining images for diagnostics and control, T3 Characterisation, T4 Forming informative paper structure images, T5 Innovation of paper structure, T6 Applying prior information), connected to the chain from design of product structure and measurement of structure to web-wide on-line measurement and diagnostics and control with present actuators.]

Figure 1. Content of the QVision project.



The project’s research work consists of the following task combinations: • T1&T2: Combining images for

diagnostics and control together with overall management of quality, with a research question of how to produce/use real-time 2-D web-wide image data for on-line control using suitable image resolutions.

• T3&T4: Imaging and characterization of particles and structure, with a research question of how to go from current/new data by current/new measurements to

new features using (multimodal/sensor) imaging and pattern recognition.

• T5&T6: Applying prior information and optimizing paper structure, with a research question of what new tools could be available; using simulation tools for innovating and (inverse) mathematical tools for modelling.

One of the main objectives is to pro-duce new kinds of data for enhanced un-derstanding and analysis of the phenom-ena influencing papermaking quality. An example is shown in Figure 2, which gives

Figure 2. Paper web variations in MD (machine direction) and CD (cross direction): (a) model of actual variation; (b) scanner-based 2-D estimate; (c) real 2-D data from a multi-camera imaging system.


Figure 2 gives alternative views of the "natural" variations on a paper web due to the paper structure and the manufacturing process. The real 2-D data are produced using a multi-camera imaging system. The task involves highly demanding on-line measurement, since the web runs at 30 meters per second and the 2-D web image consists of several images taken over a large area. Another area of interest is the use of other imaging techniques such as spectral imaging, fluorescence, polarized light microscopy, nanotomography, etc. (Figure 3). It is also important to understand the relationships between reflectance and transmittance images. This enables both the paper surface and the internal structure, each crucial to printability, to be analyzed.

The results of the research tasks were then considered from the point of view of possible future application requirements. The goal was to identify key areas for future research. A 'from research to applications' roadmap was then drawn up together with the project's industrial partners, during which each research and industrial partner was asked to propose application templates. The application templates were grouped, and the applications with the highest potential were chosen either as an application development task or a feasibility study, depending on the required stage of research. Each application development task was led by one research and one industry partner, and each feasibility study by one industry partner.

Further development focused on the following application development tasks: a) 2-D web-wide variability analyser, b) high-resolution structural measurements, c) rough-resolution structural measurements, and d) particle and fibre analysis, and on the following feasibility studies: i) 2-D control methods and ii) surface chemistry.

Figure 3. Different imaging modalities.

3. Research approach

The work was initiated by technical studies in the six task areas defined in Section 2. Parallel to this work, the application roadmap was developed and the applications with the highest potential were identified. These were further divided into applications for which technical development could be formulated as clear tasks (application cases) and those for which development required further specification (application feasibility studies). Roughly at the mid-project stage, the organization and goal setting of the project was transformed from research tasks into applications.

2-D web-wide variability analyser: The goal was to acquire image sequences from full-scale paper machines and to develop methods to correct geometric and illumination variations, stitch them together, and analyze them with statistical signal analysis methods. Moreover, quality data has been collected either on-line (Quality Control System, QCS) or off-line with the paper web analyser. The quality data was compared with the image data to enable the information on quality contained in the images to be assessed. Meaningful comparison is based on the average cross-directional variation, as the image, QCS and web analyser data sets cannot be aligned point-wise.

High-resolution structural measurement: Measurements such as 1) surface topography measurement, 2) orientation analysis of surface fibres, and 3) prediction of mottling intensity demand a high image resolution (approx. 10 µm pixel size). The developed measurements can be applied with scanning devices and in laboratories, but not yet with full web images.

Rough-resolution structural measurement: Formation, as the spatial basis weight distribution of paper, affects a variety of paper characteristics such as paper strength and printability. It can be partially controlled with available measurements and actuators, but the present on-line formation measurements are limited and current control methods do not allow 2-D control. The research focused on 1) approaches for laboratory and on-line optical formation measurement, 2) image-based characterization methods, and 3) possibilities for formation control. The experiments were carried out using the QVision paper samples and Testaa 1 process data (SUORA environment).

Particle and fibre analysis: Current image analysis systems for particle detection are not as reliable as is needed. With new imaging and lighting techniques combined with more efficient image analysis, particle detection could be improved. The same techniques could also be used for fibre analysis and could enable particle and fibre analysis directly from flowing pulp. The research focused on 1) development of imaging systems for pulp suspensions, 2) image analysis methods for particle detection, and 3) implementation of spectral and/or fluorescence imaging in particle detection, using different types of dry sample sheets and various types of pulp suspensions.

2-D control methods: Using simulated data, different algorithms were programmed in Matlab and connected to the simulation program. The starting point was the existing actuator technology and the actuator-to-measurement delay. Simulations covered different measurement and disturbance scenarios, and the comparison was based mainly on disturbance rejection properties.

Surface chemistry: The research approach involved spectral measurement of samples and statistical analysis of the measurement results. A principal component analysis (PCA) classifier was applied; PCA-based classifiers have shown their strength in classifying spectral data in many studies. The main objective and outcome was the ability to classify different ASA dosages into different classes.

Innovation of paper structure: The goal was to search for new methods, tools, and practices for paper product design. The long-term results are intended to improve the cost efficiency of product design and to offer new perspectives for product development, which has been restricted by the current technological limitations of paper production processes. Computational methods and virtual design environments could considerably help in customizing products with respect to customer needs. Although current papermaking technology is not yet advanced enough to produce such customized papers, the setting of well-defined targets for future production machinery is of valuable benefit in itself.

4. Results

4.1 2-D web-wide variability analyser

The 2-DVarA application study consists of the following three application components:
• Digital "illuminated table" providing a digital web-wide image extending tens of meters in the machine direction (MD) at submillimetre pixel resolution, enabling the user to freely choose the area of interest and the magnification.
• Medium-scale analyser of web transmittance with a "pixel" resolution of less than 1 mm in the cross direction (CD) and approximately 100 mm in the machine direction (MD), and a transmittance analysis range extending several kilometres in MD.
• Continuous basis weight estimator (servicing 2-D control) based on transmittance measurement at millimetre resolution in CD, and 100 to 5000 mm resolution in MD.

As a basis for method development, data was collected from a full-scale newsprint machine. The data consisted of scanner data from the quality control system (QCS), paper samples cut across the machine reel and measured with an off-line web analyser (Tapio), and transmittance data collected from a commercial defect detection system (Viconsys, 18 cameras across the machine). The transmittance data consisted of three elements:
• Raw images at full resolution (0.83 x 0.83 mm) over 3 s (10 bit)
• Jpg images at full resolution over 20 s
• Raw line images with a resolution of 0.83 mm (CD) x 105 mm (MD) over 120 s; the line images were constructed individually at each image processing card as MD averages of single images.

The defect detection system has not been designed for accessing web-wide transmittance maps. In order to produce the digital "illuminated table" application, the following image correction operations need to be carried out (a minimal sketch of the positioning and stitching steps follows after this list):
• Geometry and perspective correction to compensate for non-ideal camera alignment and lens distortion: obtained by using machine-directional streaks to identify the correcting homography.
• Positioning the images: obtained by first using information on the camera arrangement to get a rough order and then determining the image overlap with correlation.
• Illumination effects: several approaches were tried, including illumination models and subtracting the long-term average image. None of these is fully satisfactory for the intended application range: the illumination models are not very accurate, and subtracting long-term images removes stable streaks. The proper means of correcting for illumination non-uniformity is an illumination calibration image, which is straightforward to obtain during a production break, but was not available during this study.
• Stitching the images to a single transmittance map: obtained with multiresolution splining.
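As an illustration of the positioning and stitching steps, the following minimal sketch (Python/NumPy, not the project implementation) estimates the CD overlap of two neighbouring camera images by correlating their column-mean profiles and joins them with a simple cross-fade; the actual system additionally identifies a correcting homography from machine-directional streaks and uses multiresolution splining for the final map.

```python
# Minimal sketch (not the project code): estimate the CD overlap between two
# neighbouring camera images and blend them into one strip.
import numpy as np

def estimate_overlap(left: np.ndarray, right: np.ndarray, max_overlap: int = 200) -> int:
    """Return the overlap (in pixels) between the right edge of `left` and the
    left edge of `right` that maximizes normalized profile correlation."""
    left_prof = left.mean(axis=0)          # collapse to CD profiles (MD average)
    right_prof = right.mean(axis=0)
    best_overlap, best_score = 1, -np.inf
    for k in range(1, max_overlap + 1):
        a = left_prof[-k:]
        b = right_prof[:k]
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        score = float(np.mean(a * b))      # normalized correlation of the strips
        if score > best_score:
            best_overlap, best_score = k, score
    return best_overlap

def stitch(left: np.ndarray, right: np.ndarray, overlap: int) -> np.ndarray:
    """Join two images with a linear cross-fade over the overlap region
    (a crude stand-in for multiresolution splining)."""
    w = np.linspace(1.0, 0.0, overlap)[None, :]
    blend = w * left[:, -overlap:] + (1.0 - w) * right[:, :overlap]
    return np.hstack([left[:, :-overlap], blend, right[:, overlap:]])
```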

The current version of the digital "illuminated table" is available on the Internet. The user chooses between the methods for illumination correction according to the features of interest. Producing the application computationally is rather slow.

The medium-scale transmittance variability analyser is based on line images over several kilometres. Such images provide novel information about residual variations in the web and, thus, the new analysis methods concentrated on finding features with assignable causes in residual variation. The images revealed strong tilted waves that are most probably due to fast consistency variations entering the headbox; these variations contributed over 70% of the total transmittance variance. A method based on the directed antenna principle was constructed to extract the tilted wave amplitude as a function of tilt angle and to reconstruct the signal shape of the cause. The analysis revealed that more than half of the estimated MD variation was actually an alias of the tilted waves. Currently, methods based on feature detection, 2-D spectral analysis and principal component analysis are being constructed in order to break residual variation down into further components with assignable causes. Variance component analysis has shown that there are further structures in the residual variation once the tilted waves have been removed, and that the remaining component is not white noise. Routine methods for CD and MD analysis can be applied to the data as well. Studying the CD profile of residual variance reveals the camera structure: the method is limited by the deficiencies of the methods for compensating illumination variation. The calibration image for illumination is also expected to further enhance opportunities in this area.
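The directed-antenna analysis itself is not reproduced here; as a simplified stand-in, the following sketch (assuming the residual variation is available as a 2-D array with MD rows and CD columns) distributes the power of the 2-D spectrum over wave tilt angles, which gives a first estimate of how much of the variance is carried by tilted waves.

```python
# Illustrative sketch: distribute 2-D spectral power of a residual-variation map
# over wave tilt angles (angle convention: direction of the wave vector).
import numpy as np

def tilt_spectrum(residual: np.ndarray, n_bins: int = 90):
    """Return (angles_deg, power): power[i] is the summed spectral power of
    components whose wave vector points in direction angles_deg[i]."""
    f = np.fft.fftshift(np.fft.fft2(residual - residual.mean()))
    power = np.abs(f) ** 2
    ny, nx = residual.shape
    fy = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]    # MD frequency axis
    fx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]    # CD frequency axis
    angle = np.degrees(np.arctan2(fy, fx)) % 180.0       # axial direction, 0..180 deg
    bins = np.linspace(0.0, 180.0, n_bins + 1)
    idx = np.digitize(angle.ravel(), bins) - 1
    out = np.bincount(idx.clip(0, n_bins - 1), weights=power.ravel(), minlength=n_bins)
    centers = 0.5 * (bins[:-1] + bins[1:])
    return centers, out
```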

If the transmittance map could be converted into a basis weight estimate at line-image resolution, basis weight control would no longer be limited by QCS scanner speed. However, even in newsprint there are several disturbances to the basis weight vs. transmittance dependency, most notably moisture and filler content variations. To assess the opportunities for fast basis weight estimation based on defect detection cameras and a scanner, the long-term average profiles from the scanner data, line scan images and paper samples were compared. The correlation is rather poor; with proper CD filtering, a correlation coefficient of 0.4 was found. However, the following observations were made:
• Within paper samples the correlation between transmittance and basis weight was up to 0.9, which means that filler content is not the main cause of disturbance.
• The basis weight in the paper samples and in the QCS scanner had a correlation of 0.72. The difference is expected to be at least partly due to moisture variations, and hence the basis weight estimator to be constructed should have moisture compensation based on the QCS scanner.
• The QCS profiles were heavily filtered and not accurately aligned time-wise with the defect detection data or paper samples. The QCS profiles featured somewhat unstable areas close to the edges, perhaps due to the tilted waves.
• The correction for illumination variations in images needs to be improved. There appears to be larger deviation between basis weight profiles and line image data at the edges of individual camera areas.

On the basis of these observations, the development of a basis weight estimator that combines scanner measurements and transmittance maps should be pursued further. Correcting the illumination and using scanner data to adapt the model between transmittance, basis weight, and moisture are the next development steps. Estimation is of particular interest when identifying the CD actuator responses: the shape identification is rather uncertain with scanner data only, but could be radically improved with transmittance data, whereas the gain estimation can be carried out reliably based on the scanner.
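A toy sketch of such an estimator is given below; the linear model form (basis weight explained by log-transmittance and a scanner-based moisture term) is an assumption for illustration only, not the calibrated model to be developed.

```python
# Toy sketch of a moisture-compensated basis weight estimator:
# bw ~ a*(-log T) + b*moisture + c, fitted on co-located scanner/camera data.
import numpy as np

def fit_bw_model(transmittance, moisture, basis_weight):
    """Least-squares fit of the assumed linear model; inputs are 1-D arrays of
    co-located (CD-aligned, suitably filtered) observations."""
    X = np.column_stack([-np.log(transmittance), moisture, np.ones_like(moisture)])
    coef, *_ = np.linalg.lstsq(X, basis_weight, rcond=None)
    return coef                                        # (a, b, c)

def estimate_bw(coef, transmittance_line, moisture_profile):
    """Apply the fitted model to a high-rate transmittance line image."""
    a, b, c = coef
    return a * (-np.log(transmittance_line)) + b * moisture_profile + c
```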

4.2 High-resolution structural measurement

The application topic QFine is developing the following three measurement methods:
• Surface topography measurement.
• Orientation measurement of surface fibres.
• Optical mottling analysis.

Paper samples have been collected in order to compare different methods of paper surface measurement and to compare the samples with the surface achieved by tomography. The samples are kept for future measurements, and all data measured from the samples is archived together with the samples.

In the area of surface topography, the intention has been to develop the photometric stereo based method (also called shape from shading). A new testing apparatus for laboratory measurements has been built. The device is based on a rotating stand with LED lights and a DSLR camera. The device enables testing and analysis of a large number of images with known lighting angles. The surface gradients and albedo can be computed from multi-light measurements. When the surface is Lambertian (totally matte), the formula simplifies to fitting a three-parameter linear model to each image pixel. Although it is expected that surface gradients can be computed more accurately with more images, three images are sufficient in principle. Some new ideas have been applied to the computation of gradients. The surface is interpreted from the gradients. With the device, samples have been measured and different combinations of lights have been compared. Initial data analysis results confirm that more lights produce a better surface estimate, but the overall difference is small. The result is positive for on-line measurement development, where the number of lights will always be limited. The data analysis is still in the initial stages, but numerous possibilities lie ahead for method development and for comparison of the results with different paper types.
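For reference, the Lambertian least-squares computation can be written compactly as follows (a generic photometric-stereo sketch, not the project code); per pixel, the image intensities are explained by I = L·g with g equal to the albedo times the surface normal, and g is solved from three or more images taken under known light directions.

```python
# Minimal Lambertian photometric stereo: from >= 3 images under known light
# directions, solve per pixel for g = albedo * surface normal.
import numpy as np

def photometric_stereo(images: np.ndarray, lights: np.ndarray):
    """images: (k, h, w) grayscale stack, lights: (k, 3) unit light directions.
    Returns (albedo of shape (h, w), unit normals of shape (h, w, 3))."""
    k, h, w = images.shape
    I = images.reshape(k, -1)                      # (k, h*w) intensities
    G = np.linalg.pinv(lights) @ I                 # (3, h*w) least-squares per pixel
    albedo = np.linalg.norm(G, axis=0)
    normals = (G / (albedo + 1e-12)).T.reshape(h, w, 3)
    return albedo.reshape(h, w), normals

# The surface gradients follow as p = -n_x / n_z and q = -n_y / n_z, and the
# height map can then be integrated from (p, q).
```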

As regards orientation measurement, different approaches to surface fibre orientation analysis using different types of images have been investigated: 1) the traditional image gradients method has been studied and applied, 2) the 2-D spectrum method has been tested, 3) a new method that uses wavelets to characterize surface orientation has been developed (a description of this innovation has been submitted by Matti Lassas, Jouni Takalo, Jouni Sampo, Samuli Siltanen, and Jussi Timonen), and 4) a new method for orientation measurement using the scale-invariant feature transform (SIFT) has been tested. It has become clear that the lighting arrangement for imaging is a crucial factor when measuring surface orientation. Different lighting arrangements have therefore also been tested along with different computational methods. We have used 1) transmittance images, 2) dark-field lighting, and 3) light series as used in topography measurement. All of these methods have given similar results with the same paper samples, although some lighting-dependent computational tricks can be used. The results indicate that, when used within reason, there is virtually no difference between the lighting methods.

Work has also been carried out to produce a reliable map of the surface orientation distribution. Here, the axial distribution of the fibre orientation requires conversion into a standard-type distribution by doubling the angles. This is because the orientation and anisotropy parameters of standard distributions are compatible with simple filtering analysis, which is needed for the formation of orientation maps.
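A minimal gradient-based sketch of such an orientation map is shown below; it smooths the doubled-angle components of the local gradient direction (the structure-tensor formulation), so that the axial 0–180° orientation data can be averaged with ordinary filters. The parameter values are arbitrary, and the dominant fibre direction is perpendicular to the dominant gradient direction.

```python
# Sketch: local orientation map via image gradients with angle doubling, so
# that axial (0..180 deg) orientations can be averaged by ordinary filtering.
import numpy as np
from scipy.ndimage import gaussian_filter

def orientation_map(img: np.ndarray, sigma: float = 8.0):
    gy, gx = np.gradient(img.astype(float))
    # Doubled-angle (structure-tensor) components of the gradient direction.
    a = gaussian_filter(gx * gx - gy * gy, sigma)      # cos(2*theta) component
    b = gaussian_filter(2.0 * gx * gy, sigma)          # sin(2*theta) component
    theta = 0.5 * np.arctan2(b, a)                     # dominant gradient orientation
    energy = gaussian_filter(gx * gx + gy * gy, sigma)
    anisotropy = np.sqrt(a * a + b * b) / (energy + 1e-12)
    # Fibre orientation is orthogonal to the dominant gradient orientation.
    return theta + np.pi / 2.0, anisotropy
```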

The mottling printing defect is caused by either inconsistent raster dot ink spread or dot gain variation. The latter is an optical phenomenon explained by the distribution of light at the borders of the raster dots, where variation can be caused by variation in the paper structure. A patented method exists for the measurement of optical mottling without the use of ink. The patent describes a 50x50 µm chessboard image on a transparency. We have tested the printed transparency, but so far no reliable results have been obtained because the method requires very close contact between the transparency and the entire paper surface, which has so far not been achieved. We have also devised a projector that is able to produce a 30x30 µm chessboard image, which should show similar variation effects on the uneven paper structure. The projector has yet to be tested.

4.3 Rough-resolution structural measurement

The QRough application development task focused on providing 1) ideal optical formation measurement for on-line use, 2) knowledge of the connections between optical formation and controllable variables, 3) profiles of selected characteristics for early warnings for raw materials and process chemistry, and 4) improved possibilities to control and handle problems related to fibre raw material and/or the chemical state of the process.

The research work during QVision (EffTech) concerning rough-resolution structural measurement produced the following results:

• Laboratory imaging experiments on different paper samples with front (reflection) and back illumination (transmission): Experiments with different illumination options showed that frontal dark-field illumination emphasizes the surface details and, depending on the spatial resolution, makes surface orientation analysis possible. The future of rough-resolution characterization of paper lies in combining reflection and transmission data.
• Image pre-processing methods for formation images for laboratory and on-line use: To use camera-based transmission images with global image analysis methods, uneven image illumination has to be corrected. Several methods are available to model the illumination field and correct the uneven illumination. Similarly, the periodic structure of the wire affects the analysis of the transmission images. The periodic signal is commonly removed by using frequency-domain filtering (cf. de-screening of raster print images).
• Image restoration method for optical transmission images: X-ray transmission maps are thought to reveal the true mass density variation in paper. However, x-ray measurements cannot be performed on-line, or even off-line for large paper samples. In the study, prior information in the form of an edge spread function was used as the basis for deconvolving the actual variation based on optical measurements.
• Feature-based image registration method: To combine image information from multiple sources with possibly different modalities, image alignment is needed. A registration method based on image features was developed for coarse and fine image alignment, and it was tested with pulp sheet images with dirt particles.
• Implementations of standard formation characterization methods: The standard ways to characterize the variation include first-order statistics such as the standard deviation, specific formation, and the coefficient of variation (a minimal example follows after this list). In addition, methods arising from image processing, starting from specific perimeter and micro scale, were implemented as the baseline methods for further development.
• Initial experiments with Testaa 1 data (SUORA, 25.–29.8.2008, VTT Jyväskylä): The data consists of process data from Metso DNA and transmission images from a single Viconsys camera. Despite its limitations (averaging of process data and limited synchronization accuracy), the data was used for method development, particularly methods for data synchronization and pre-processing of images.
• Possible approaches to formation control: One direction for the development of formation (and orientation) control is to imitate present scanner-based controls using headbox jet flow and flocculation chemical addition as manipulated variables. Faster variations would be identified using diagnostics methods. More research on using image data, and actuator studies, is still needed.
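The baseline first-order statistics mentioned above can be computed directly from a calibrated local-grammage map, for example as in the following sketch; the definition of specific formation below is one common convention and may differ from the project's implementation.

```python
# Baseline first-order formation statistics from a local-grammage map
# (calibration of the transmission image to g/m^2 is assumed).
import numpy as np

def formation_statistics(grammage: np.ndarray) -> dict:
    mean = float(grammage.mean())
    std = float(grammage.std())
    return {
        "mean_gsm": mean,
        "std_gsm": std,
        "cov_percent": 100.0 * std / mean,        # coefficient of variation
        # One common convention: std normalized by sqrt of the mean basis weight.
        "specific_formation": std / np.sqrt(mean),
    }
```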

4.4 Particle and fibre analysis

It was shown that with a simple imaging system, consisting of a CCD camera and a stroboscope light source, fairly good images can be obtained from flowing pulp even at high consistencies. It is clear that with better lighting and a higher-resolution camera much better images could be obtained. Better images are needed for automated particle and fibre detection and analysis.

The image analysis of the polarized light method for dirt counting was developed and largely automated. The method's remaining bottleneck is image acquisition. The samples presently used consist of conventional microscope slides. The method needs to be implemented in a flow cell arrangement to make it fast enough to gather statistically significant data.

One part of this study focused on dirt counting and characterization of dirt particles in pulp sheets. Two segmentation methods, which can adapt to different backgrounds and uniformity, are proposed to overcome the majority of the image analysis problems of current systems (Panjeh Fouladgaran 2010b). Future work includes field testing of the methods with a more representative set of pulp samples, and the combined use of reflection and transmission images for feature extraction.
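The two proposed segmentation methods are described in the cited publication; as a generic baseline of the kind of operation involved, the sketch below thresholds each pixel against a local background estimate (so that an uneven background is tolerated) and removes small connected components.

```python
# Generic baseline for dirt-particle segmentation on an uneven background
# (not the methods of Panjeh Fouladgaran et al. 2010b).
import numpy as np
from scipy import ndimage

def segment_dirt(img: np.ndarray, win: int = 51, k: float = 4.0, min_area: int = 5):
    """img: grayscale pulp-sheet image in which dirt is darker than the background."""
    img = img.astype(float)
    background = ndimage.uniform_filter(img, size=win)     # local background level
    noise = ndimage.uniform_filter(np.abs(img - background), size=win) + 1e-6
    mask = (background - img) > k * noise                  # locally dark pixels
    labels, n = ndimage.label(mask)                        # connected particles
    if n == 0:
        return mask
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    small = np.where(areas < min_area)[0] + 1              # labels of tiny specks
    mask[np.isin(labels, small)] = False
    return mask
```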

The initial tests show that spectral and fluorescent imaging could have high potential in particle and fibre analysis. Samples from different stages of the pulp mill show significant differences in the first three principal components. It should be studied whether these differences have any correlation, for example, with the progress of pulp bleaching in the different bleaching stages. If so, this information could be utilized in bleaching control. It was also shown that different types of particles can be distinguished by their spectra. This could be a major benefit in particle classification.

4.5 2-D control methods

This feasibility study aims at evaluating the technical feasibility and studying the potential of 2-D control of the paper web based on improved-resolution transmittance measurement combined with existing scanner measurements. The analysis of improvement potential starts from the existing actuator technology and the actuator-to-measurement delay. The work reported here covers the testing of different 2-D control schemes combined with different measurement scenarios using simulation. Comparison is based mainly on disturbance rejection properties (minimum variance control).

New measurement capabilities based on paper machine fault detection system images provide virtually continuous estimates also of quality variables such as basis weight. The new measurement technologies also enable observation of cross-direction (CD) actuator dynamics, and the control interval can be decreased to seconds. Predictive controllers are required to deal with the dead time. The challenge is to perform the control calculations fast enough to meet real-time requirements.

In this case, the CD process, i.e. the actuator input-output relationship, is described with a non-square interaction matrix. The number of measurement locations is much higher than the number of actuators. The scanner is assumed to move at a constant speed of 25 cm/s, and it takes one additional second to complete the turn at the edge of the web. As the simulated web dimensions in CD and MD are 400 cm and 1000 s, respectively, the total scan time is thus 17 s. There are 67 CD actuators, which are assumed to be dilution valves. The dynamic responses of these actuators are described by a first-order-plus-delay model, in which each actuator is assumed to have the same dynamic response. In the chosen 2-D controller, a linear optimal controller is used to calculate the steady-state control actions and a predictive controller controls a part of the residual variation. The performance of the 2-D controller is compared to a conventional model predictive controller. The implementation is made with Matlab's Model Predictive Control Toolbox.
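The predictive part and the Toolbox implementation are not reproduced here, but the steady-state calculation with a non-square interaction matrix can be illustrated with a regularized least-squares sketch; the dimensions in the comment correspond to the simulated case, everything else is generic.

```python
# Sketch of a steady-state CD control calculation: given a non-square
# interaction matrix G (n_meas x n_act) and a measured CD deviation profile,
# compute actuator moves by regularized least squares.
import numpy as np

def cd_control_move(G: np.ndarray, deviation: np.ndarray, lam: float = 0.1) -> np.ndarray:
    """Minimize ||deviation - G du||^2 + lam*||du||^2; the correction applied to
    the process is then -du (or the sign dictated by the model convention)."""
    n_act = G.shape[1]
    A = G.T @ G + lam * np.eye(n_act)
    b = G.T @ deviation
    return np.linalg.solve(A, b)

# Dimensions of the simulated case: 67 dilution actuators and a much denser
# measurement profile, e.g. G with shape (400, 67).
```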

The 2-D controller worked well in all simulated disturbance scenarios, including tilted CD profiles and sharp changes in the cross direction. The 2-D controller performed at least as well as the reference controller in attenuating steady-state disturbances. In the disturbance scenario with dynamically changing CD profiles, the 2-D controller performed better than the reference controller. The difference between the performances is modest, but statistically significant. The disturbances simulated in this comparative study develop slowly; the difference is emphasized when faster variations are simulated. The simulations show that even with nominal tuning, the 2-D controller can attenuate disturbances over a wide frequency range.

The 2-D controller seems usable, at least in the simulation environment. Further simulations with more comprehensive disturbance scenarios should, however, be made. Most web forming controllers use filtering methods that reduce the dimensions of the problem. Such filters will be applied and tested later with the 2-D controller.

4.6 Surface chemistry

The main focus of the surface chemistry task was to study a Kemira case to find a reliable and quick off-line method for ASA application using IR spectroscopy. For these purposes, a sample set of 75 laboratory-made handsheets was obtained from Kemira. The sample set contained three different pulp groups of varying furnish and filler content: 100% chemical pulp (50% birch + 50% pine); 30% chemical pulp (50% birch + 50% pine) + 70% mechanical pulp; and 30% chemical pulp (50% birch + 50% pine) + 70% mechanical pulp + PCC. Each of the three groups contained 5 different ASA dosages: 0 kg/t, 0.5 kg/t, 1 kg/t, 3 kg/t and 5 kg/t. Five parallel sheets of each type were used (total sheet number 3 x 5 x 5 = 75).

Spectral measurements were made with a scanning-type spectrophotometer equipped with an integrating sphere. The measurements were made with 0/d geometry in the wavelength range 200 nm to 2500 nm with a 5 nm interval. All samples were measured on both sides. The results showed that the spectral shapes differ between the furnish and filler groups, so the groups are easily classified with a few wavelengths. Conversely, the spectral shapes within each furnish and filler class were highly similar irrespective of the ASA dosage. Statistical classification methods are therefore needed. The PCA classification method was tested at a number of different wavelength ranges. The results of the classification using PCA with subspace dimension 7 are shown in Table 1. Each group was classified separately. A wavelength range from 1600 nm to 2300 nm with a 5 nm interval was used for the classification. Due to the small number of spectra in each class, the classification was made using the leave-one-out method: one spectrum at a time is classified while the other spectra are used for training the classifier. The results are very promising.

                 Chemical pulp            Chemical pulp +            Chemical pulp +
                                          mechanical pulp            mechanical pulp + PCC
        C1   C2   C3   C4   C5     C1   C2   C3   C4   C5     C1   C2   C3   C4   C5
  C1    10                         10    1                    10    1
  C2          9    1                     9                           9
  C3          1    8                          10                          10
  C4               1   10                          10                          10
  C5                        10                          10                          10

Table 1. Classification results using PCA, subspace dimension 7. C1 to C5 correspond to ASA dosages 0 kg/t, 0.5 kg/t, 1 kg/t, 3 kg/t and 5 kg/t, respectively.

However, further study related to the PCA analysis is needed to confirm the optimal dimension of the basis vectors and the best wavelength ranges.
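For illustration, a leave-one-out PCA-subspace classification of this kind can be sketched as follows (NumPy, with assumed array layout and preprocessing); a spectrum is assigned to the class whose seven-dimensional subspace reconstructs it with the smallest residual.

```python
# Illustrative leave-one-out PCA-subspace classifier for spectra grouped by
# ASA dosage class; not the project implementation.
import numpy as np

def class_subspace(spectra: np.ndarray, dim: int = 7):
    """Return (mean, basis) of a class from its training spectra, shape (n, bands)."""
    mean = spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(spectra - mean, full_matrices=False)
    return mean, vt[:dim]                                  # principal directions

def residual(x: np.ndarray, mean: np.ndarray, basis: np.ndarray) -> float:
    d = x - mean
    return float(np.linalg.norm(d - basis.T @ (basis @ d)))

def loo_confusion(groups: dict, dim: int = 7) -> np.ndarray:
    """groups maps class label 0..4 to an array of spectra; returns a 5 x 5
    confusion matrix (rows: true class, columns: predicted class)."""
    labels = sorted(groups)
    conf = np.zeros((len(labels), len(labels)), dtype=int)
    for true in labels:
        data = groups[true]
        for i in range(len(data)):
            models = {}
            for c in labels:
                train = np.delete(data, i, axis=0) if c == true else groups[c]
                models[c] = class_subspace(train, dim)     # leave the test spectrum out
            pred = min(labels, key=lambda c: residual(data[i], *models[c]))
            conf[true, pred] += 1
    return conf
```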

Good sizing is essential for certain paper and board grades (e.g. liquid packaging board). If sizing is not successful, the product may be rendered useless with respect to the intended end product. The current off-line and laboratory test methods are slow, and thus disturbances in sizing are not observed early enough. This results in wasted production and, in the worst cases, customer claims. A reliable on-line method for the measurement of sizing would thus be of great value. One essential publication was found: Martorana E., Fischer S., Kleemann S., "Quantitative analysis of synthetic sizing agents (ASA/AKD) using NIR spectroscopy", Nordic Pulp and Paper Research Journal 24, 2009.

4.7 Innovation of paper structure

Our research goal was to explore novel methods for the design of tailored paper products. During the course of the QVision programme we have completed the specification and demonstration phases of our long-term project. We have researched and developed design methods which help create new tailored paper products optimized with respect to material costs and paper quality. These conflicting demands lead to the formulation of multiobjective optimization problems, which primarily draw upon such paper quality measures as spatial basis weight distribution, tensile strength, bending stiffness, brightness and opacity, which are described by mechanical and optical models as well as by statistical distributions and simulation models.

We report the project’s results in the form of the Paperrin 1.0 demonstration tool. The tool enables the optimization of paper structures with respect to materi-al costs while maintaining desired quality measures. We have specifically pledged in-programme support for a multitude of essential physical paper characteristics to be used for multiobjective optimiza-tion purposes. The Paperrin tool produc-es plausible optimization results which are still to be verified physically.

All models used in Paperrin are implemented with the Numerrin 4 modelling language developed by Numerola. Multiobjective optimization problems are solved using an SQP solver and the weighted sum method.
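The Numerrin models themselves are not reproduced here; as an illustration of the solution strategy only, the sketch below scalarizes a toy two-objective problem (material cost against a quality penalty) with the weighted sum method and solves it with SciPy's SLSQP, an SQP-type solver. The objective functions, prices and constraint are invented stand-ins, not the Paperrin models.

```python
# Toy illustration of weighted-sum scalarization solved with an SQP-type method.
import numpy as np
from scipy.optimize import minimize

def material_cost(x):          # x = layer grammages (g/m^2) of a 3-layer sheet
    prices = np.array([0.9, 0.4, 0.9])           # hypothetical relative layer costs
    return float(prices @ x)

def quality_penalty(x):        # hypothetical distance from a stiffness target
    return float((x[0] + 0.2 * x[1] + x[2] - 40.0) ** 2)

def solve_weighted_sum(w_cost=0.7, w_quality=0.3):
    objective = lambda x: w_cost * material_cost(x) + w_quality * quality_penalty(x)
    constraints = [{"type": "ineq", "fun": lambda x: x.sum() - 60.0}]  # min total grammage
    bounds = [(5.0, 80.0)] * 3
    return minimize(objective, x0=np.array([20.0, 20.0, 20.0]),
                    method="SLSQP", bounds=bounds, constraints=constraints)

# Sweeping the weights (w_cost, w_quality) traces an approximation of the
# Pareto front between material cost and the quality measure.
```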

The Paperrin tool allows the user to set optimization problem objectives and constraints as inputs. The problem solution output resembles a composition of paper layers and their constituents. Users are encouraged to experiment with the available inputs and settings to see their resulting impact on the paper quality properties.

The Paperrin tool’s main tab con-tains parameter input fields, certain pa-per composite property fields and output fields for physical magnitudes character-izing the paper structure as a whole as well as its individual layers. Certain other input parameters (such as elastic modu-lus dependency) are presented on a sep-arate ‘Model properties’ tab. The tool’s in-put fields are used to launch the simula-tion, which produces outputs in either nu-merical or graphical format.

The application’s default work flow in-cludes the following operational phases: 1. Understanding the requirements of

the new paper product, 2. Defining

objectives and constraints for the optimization procedure,

3. Checking the physical model parameters,

4. Performing the optimization,5. Evaluating and displaying the

results, and 6. Altering the optimization settings

and repeating the optimization process until the desired paper structure is achieved.We have devised three real-life prac-

tical scenarios of paper structure de-sign in cooperation with paper industry professionals. The scenarios are intend-ed to provide us with guidance through the planning, implementation and testing phases of our research.

5. Future plans and key development needs

The EffNet WP 7 project will continue the work initiated in the QVision project under EffTech. The objective of WP 7 is to provide means for managing the uniformity of web material at the macro scale, in particular the continuous over-time and machine-wide development of bulk and surface microstructure. Until very recently, wide ranges of variability have been wholly unobservable in production. These are now accounted for, largely due to QVision. The goal is to characterize and manage the web at the following scales:
• Sub-formation scales of surfaces affecting printing: mottling effects, deep surface pores in uncoated surfaces
• Fast web-wide temporal structures (residual variation at a resolution of 10 cm to 1 m in the machine direction and 1 mm in the cross direction)
• Web-wide structures at high resolution (sub-mm)
• Formation-scale structure, including fibre orientation
• Slow web-wide temporal structures (MD and CD profiles)

The characterization will be based on digital imaging, which has been the most rapidly developing area of electronics over the last decade. Inference based on images will be supported by advanced use of structural models and other prior information according to the principles of inverse and Bayesian methods. The management of uniformity will consist of a combination of feedback control and diagnostics.

2-D web-wide variability analyser: Work will be continued under the EffNet programme as indicated above. Furthermore, outside the programme, the integration of the Viconsys defect detection imaging system and QCS systems will be pursued so that the functionalities developed can be commercialized on an appropriate platform. It is also assumed that commercialization of some functionalities will be carried out by SMEs.

High-resolution structural measurement: The produced data will be further analyzed in EffNet. 1) Surface topology and 2) surface fibre orientation: the developed multi-light measurement device will be extensively used to test, analyze and further develop the lighting setup, imaging methods, computational methods and image noise suppression. 3) Optical dot-gain variation: testing of the printed transparency method will continue, as will testing of projected patterned light for printability analysis. The comparison of tomographic data (from x-ray and optical coherence) against measured surface topography and surface fibre orientation will continue. Methods and setups will be further arranged for multi-spectral imaging for high-resolution structural measurements.

As a collaboration between high- and rough-resolution structural measurement, the development of methods to compute fibre orientation from optical transmission and reflectance images is being continued. The aim is to determine the material distribution of paper based on its optical transmission.

Rough-resolution structural measurement: The produced results will be used in EffNet. The available measurement, processing, analysis, and control methods, as well as experience with the Testaa 1 data, will be used to plan and implement future experiments in order to obtain appropriate data.

Particle and fibre analysis: The research started in QVision will be continued in the PulpVision project, which consists of four parallel Tekes ERDF projects. The next development steps will include: 1) field tests of image analysis methods with a more representative set of pulp samples, and combined use of reflection and transmission images for feature extraction; 2) implementation of spectral and fluorescent imaging results in image analysis; 3) testing image analysis methods with images obtained from pulp suspensions; and 4) further development of imaging systems for pulp suspensions.

2-D control methods: The research continues as a doctoral thesis and in the EffNet programme. New algorithms and disturbance scenarios will be tested with simulations. The performance of 2-D control of residual variations, integrated with current CD and MD controls, will be analyzed with current actuator systems (e.g. dilution headbox), and possibilities for faster actuators will be studied. The benefits of continuous full-web imaging, such as accurate actuator response identification, web shrinkage estimation, and reduced uncertainty in variation estimates, will be integrated into the 2-D control concept.

Innovation of paper structure: Further steps of the project include development of the optical model in cooperation with optics research groups participating in the EffNet programme. The model needs to be improved with regard to physical practicality while maintaining the current computational efficiency. Also, the mechanical model is to be enhanced via utilization of our recently developed two-dimensional treatment instead of point-wise simulations. The specifications produced within the framework of the QVision project will be utilized as guidelines for further practical development work. Our featured modelling methods can also be used for modelling and testing fibre-based materials developed by other EffTech participants.

6. Exploitation plan and impact of results

2-D web-wide variability analyser: Of the three sub-applications of the 2-D variability analyser, the digital illuminated table has been demonstrated and is ready for the product development phase. The analysis of 2-D transmittance variation over several kilometres in the machine direction needs further refinement as well as test cases, such as identification of the shape of the CD actuator response. This will be included in the research tasks of the EffNet programme. Estimation of 2-D basis weight variation based on transmittance maps and scanning basis weight measurement requires further research. This research, also included in the EffNet programme, is strongly connected to developing 2-D control applications (see below), which address the questions of which variables should be controlled and whether different target variables should be applied at different frequency ranges. Essential to all exploitation avenues is the need for more uniform illumination in defect detection systems or an effective way of compensating computationally for variations in illumination. The latter is expected to be relatively straightforward with appropriate calibration imaging, which, however, was not available during QVision. The former is a more long-term development issue and should be carried out by defect detection system vendors.

High-resolution structural measurement: The produced data will be further analyzed in EffNet. Methods related to surface topology, surface fibre orientation, and optical dot gain variation will be developed further. Tomographic data (from x-ray and optical coherence) will be used as the reference for optical measurements. Spectral imaging will also be evaluated further.

Rough-resolution structural measurement: The produced results will be used in EffNet. The available measurement, processing, analysis and control methods will be used to plan and implement future test runs to collect appropriate process data and test control schemes. In their current stage, the results are not directly applicable to industrial processes. The feature-based image registration method will be further developed and applied at least to dirt counting and classification.

Particle and fibre analysis: The project involves co-operation between four research groups in testing imaging and machine vision techniques in different practical cases using different pulp and paper samples. The studies will be continued under the large-scale four-year Tekes/ERDF project PulpVision. The PulpVision project works to develop new imaging and machine vision based measurements for use in the analysis and control of wet-end pulp and papermaking processes. The project involves eight participating companies, mainly SMEs.

2-D control methods: The research continues under the EffNet programme. The benefits of continuous full-web imaging, such as accurate actuator response identification, web shrinkage estimation, and reduced uncertainty in variation estimates, will be integrated with the 2-D control concept.

Surface chemistry: Industrial interest is focused on ASA dosage measurement and analysis. The results closely meet the objectives of the QVision project's surface chemistry task.

Innovation of paper structure: The produced results are directly exploitable in the production of design and research tools for new optimized multilayer paper products. The end users of the results are R&D personnel in the paper and board industry and their equipment and raw material suppliers.

7. Publications and reports

Kälviäinen H. (2010a). "Image Print Quality Assessment by Machine Vision", Sino-foreign-interchange Workshop on Intelligence Science and Intelligent Data Engineering (IScIDE 2010), 3–6 Jun 2010, Harbin, China. Invited talk.

Kälviäinen H. (2010b). "Machine vision based quality control from pulping to papermaking for printing", 10th International Conference on Pattern Recognition and Image Analysis (PRIA-20-2010), St. Petersburg, Russian Federation. Invited talk.

Ohenoja M. (2010a). "Application feasibility study of 2D control methods", Report series A, Control Engineering Laboratory, University of Oulu.

Ohenoja M., Isokangas A., Leiviskä K. (2010b). "Simulation studies of paper machine basis weight control", Report series A, Control Engineering Laboratory, University of Oulu.

Ohenoja M., Ylisaari J., Leiviskä K., Ritala R. (2010c). "Analyzing 2-Dimensional Variation Based on Scanning and Imaging Measurements", TAPPI PaperCon 2010, 23rd Process Industry Reliability and Maintenance Conference, 2–5 May 2010, Atlanta, GA.

Panjeh Fouladgaran M. (2010a). "Imaging and Characterisation of Dirt Particles in Pulp and Paper", M.Sc. thesis, Lappeenranta University of Technology.

Panjeh Fouladgaran M., Mankki A., Lensu L., Käyhkö J., Kälviäinen H. (2010b). "Automated Counting and Characterization of Dirt Particles in Pulp", International Conference on Computer Vision and Graphics (ICCVG 2010), 20–22 Sep 2010, Warsaw, Poland.

Raunio J.-P., Tirronen V., Ritala R., Nironen I., Rossi R., Kärkkäinen T. (2010a). "Web-Wide Diagnostics of Paper Properties based on Fault Detector System Images", TAPPI PaperCon 2010, 23rd Process Industry Reliability and Maintenance Conference, 2–5 May 2010, Atlanta, GA.

Raunio J.-P., Tirronen V., Ritala R., Nironen I. (2010b). "Web-wide diagnostic of paper: utilization of light transmittance images in analysis of paper properties", Control Systems 2010, September 2010, Stockholm, Sweden.

Takalo J., Timonen J., Sampo J., Siltanen S., Lassas M. (2010a). "Wavelet-based inverse method as a tool in paper quality assessment", Physics Days 2010, 11–13 March 2010, Jyväskylä, Finland. Poster presentation.

Takalo J., Timonen J., Sampo J., Siltanen S., Lassas M. (2010b). "Wavelet-based inverse method as a tool in paper quality assessment", Inverse Problems: Modeling & Simulation, 24–29 May 2010, Antalya, Turkey. Invited talk.

Ylisaari J., Ritala R. (2010). "Web-wide transmittance imaging measurement variability analysis", Control Systems 2010, September 2010, Stockholm, Sweden.


Re-engineering paper (REP)

Project Manager: Erkki Hellén, [email protected]

Duration of the project: 1.6.2008–30.8.2010

Project budget: EUR 2,050,000

Project partners and role of participating organization:

Biosafe – Special Laboratory Services Ltd (Atte von Wright): Safety assessment of materials and products used and generated in the project using in vitro toxicity tests.

FPInnovations, Canada (Tetsu Uesaka): Development of numerical tools to simulate the formation of nano-scale network structures under different process conditions.

Aalto University, Complex Systems and Materials group (Mikko Alava): Simulation of aggregation dynamics of filler particles in liquid or nanofibre suspension; calculation of mechanical properties of structures formed from aggregates.

University of Helsinki, Department of Physics (Ritva Serimaa): Structural characterization of materials and products using X-ray methods.

University of Helsinki, Observatory (Kari Lumme): Development of methods to estimate the optical properties of simulated structures including cellulose nanofibrils; packing simulations of filler aggregates.

Tampere University of Technology, Biological Physics and Soft Matter group (Ilpo Vattulainen): Development of atomic, molecular and coarse-grained models for nanocellulose and studying its interactions with inorganic particles.

VTT Technical Research Centre of Finland (Erkki Hellén; also includes the former KCL role): Coordination of the project and 1) production and characterization of materials and structures, 2) coordination of modelling efforts and modelling of strength and rheological properties of fibre networks with nanocellulose, 3) significant experimental contribution, especially providing data for modelling purposes and demonstration of new product and process concepts, 4) analysis of sustainability and safety of new materials, process concepts and products.


Abstract

The Re-engineering paper (REP) project studies technologies to enable resource-efficient and sustainable renewal of fibre-based products. The focus is on the potential of nanofibrillated cellulose (NFC) in 1) pure NFC, 2) filler-NFC, and 3) fibre-NFC structures. The project has identified and demonstrated several product and process concepts which utilize cellulose nanofibres and show new combinations of characteristics, and screened their environmental aspects. New laboratory-scale facilities have been established (manufacturing, fractionation and safety of nanocelluloses; production of controlled sheet structures; characterization of nanocelluloses and structures). The importance of both nanocellulose quality and process conditions with respect to sheet structures has been shown. State-of-the-art simulation tools for multi-scale modelling have been developed in a network of five top research teams and integrated in the Simantics modelling and simulation environment. The safety of nanocelluloses has been assessed using in vitro toxicity tests.

Keywords: nanocellulose, papermaking, simulation, process concept, resource efficiency


1. Project background

Cellulose nanofibres show great potential in expanding the use of sustainable raw materials in the forest products industry. They allow paper and board products to be produced with much lower consumption of raw materials, water and energy. They can also be used to develop completely new fibre-based products with characteristics that cannot be achieved with present-day raw materials.

Making optimal use of nanomaterials requires an overall picture of how the characteristics of the end product are influenced by the formation and dewatering mechanisms containing these new materials. Finding ways to engineer products based on novel raw materials demands a profound understanding of the interactions between them. In order to speed up product and process development and to reduce costs and risks, we need virtual product models which take into account the properties of new nanomaterials and other essential structural components. Measurements of material properties and well-controlled laboratory and pilot trials represent an important part of this chain. Sustainability and product safety are other important considerations when choosing between the available options.

2. Project objectives

The REP project aims at the renewal of paper and board making with cellulose nanofibres. The long-term target of the project is to develop
• a resource-efficient technology for nanofibre-based sheet production that
  o is economically profitable
  o complies with sustainability requirements
  o enables radical reductions in the consumption of raw materials, water and energy in the production process
  o leads to new products
• advanced multi-scale models, which
  o help in identifying the potential of new materials
  o enhance product development
  o guide process development

Short-term goals (2 years, till 06/2010):
• To select and test the most feasible new materials and production technologies with respect to resource efficiency
• To demonstrate paper-like structures, which
  o are based on new nano- and biomaterials
  o offer completely new combinations of characteristics
• To identify techniques required to characterize nanomaterials and the structures containing them
• To produce structures containing nanoparticles in a controlled way
• To demonstrate a multi-scale model for paper-like structures containing nanocellulose and apply it to solving a case problem defined by experiments
• To screen for environmental and safety effects of promising new materials and production technologies to guide the project in a sustainable direction


3. Research approach

The Re-engineering paper (REP) project is built around the belief that a very important category of future forest products will be produced from renewable wood-based raw materials in reel-to-reel operations. These future “paper” products will have a greatly expanded property space compared to today’s products and will thus compete in entirely new markets. This transformation is driven by the rapid developments in renewable material science and will require a completely new manufacturing platform to draw on.

To meet these targets, the REP project has explored the possibilities offered by cellulose nanofibres. Figure 1 illustrates the main ideology of the project. The project has concentrated on web-based products and processes. Experiments have been intentionally limited to the laboratory scale. The emphasis has been on new products and processes, particularly on forming structures with novel physical properties. Coating applications and applications of cellulose nanofibres in current papermaking processes have been excluded.

REP has concentrated on laboratory-scale demonstrations of products containing nanofibrillated cellulose, screening of potential resource-efficient technologies, and starting the development of a virtual product modelling environment. Several potential product concepts and efficient processes have been identified and demonstrated. A demo of the virtual product environment has been carried out. Environmental and safety screening indicated no major concerns. The next step is to continue up-scaling the most promising technologies and to show how nanocelluloses together with these technologies lead to extended properties for fibre-based products.

Figure 1. The Re-engineering paper (REP) project aims at the renewal of paper and board making using cellulose nanofibres. This requires rethinking of existing production processes but offers possibilities for new, high-value products.


4. Results

4.1 Facilities to make and fractionate nanofibrillated celluloses

Controlled production of NFC at the laboratory scale has been established. The nanofibrillated materials were produced with a Masuko grinder. The effect of different process parameters on nanofibril production was evaluated and the reproducibility of the process was verified in terms of energy consumption and end product quality. The production rate of the grinder is about 640 g/h (dry material) for 1 pass of NFC and about 145 g/h (dry material) for 6 passes of NFC.

Mechanical fractionation devices were constructed at VTT. They can be used for fractionation of different fibre/particle suspensions in the particle size range of 0.1–150 micrometers. The devices were used to fractionate VTT Masuko ground 1-pass and 6-pass suspensions to five size fractions. The largest fibre fragments were removed during fractionation and the fibril size, both length and width, decreased as the fractionation proceeded.

4.2 Characterization of nanocelluloses requires advanced methods

Nanocellulose is a novel raw material requiring different characterization methods than ordinary wood fibres. Figure 2 illustrates the methods used to characterize nanocelluloses and structures containing them. Table 1 summarizes the status of the characterization methods. In conclusion, several methods need to be used to characterize NFC. Aspect ratio, degree of fibrillation, fibril size and the amount of unfibrillated material can be evaluated with a combination of viscosity, transmittance (or turbidity) and SEM measurements. Viscosity measurement is sensitive to small fibrils, whereas transmittance/turbidity gives an estimation of the amount of larger particles.

Figure 2. Main characterization methods tested.


The physical properties of NFC sheets also provided additional information. The crystalline structure, nano-porosity and specific surface area of NFC materials can be analyzed with delicate measuring techniques such as x-ray microtomography, x-ray diffraction, and wide-angle and small-angle x-ray scattering techniques.

However, there are still challenges, mainly due to the intrinsic nature of NFC. The very wide size range together with the fibrillated shape makes it difficult to determine the actual particle size. This is further complicated by the branching of fibrils. Nanofibrillated celluloses also flocculate very readily, and dispersing of NFC material prior to measurement is essential. Although indirect measurements (such as viscosity) give valuable information on nanocellulose quality, they cannot be used to calculate actual NFC properties such as length or thickness. Moreover, methods capable of rapidly determining NFC quality in process conditions are needed.

4.3 Safety of nanocelluloses

Nanocelluloses have many potential applications but are subject to the same safety concerns as other nanomaterials. As with other nanomaterials, the eventual biological effects cannot be predicted solely from the chemical nature of cellulose. The size, shape and aggregation properties, among other still poorly understood factors, may affect the interaction of nanocellulose particles with cells and living organisms.

Table 1. Preferable measurement techniques for NFC characterization. Red indicates methods that are important for NFC quality control and blue indicates methods still under development.

Property                          | Preferable method          | Best alternative                 | Second alternative
Fibril width                      | SEM, image analysis        | AFM, image analysis              | TEM, image analysis
Fibril length                     | –                          | SEM                              | –
Hydrodynamic radius               | –                          | Nanosight                        | Photon correlation spectroscopy
Aspect ratio                      | Viscosity                  | Microscopy methods               | –
Amount of unfibrillated fibres    | Transmittance, turbidity   | Light microscopy, image analysis | Luukko method, Fractionator analysis
Size of fines and fibre fragments | Fractionator analysis      | Light microscopy                 | SEM, Fractionator analysis
Crystallinity %                   | WAXS                       | XRD                              | –
Crystallite size                  | WAXS (width, length)       | XRD (width, length)              | –
Porosity                          | SAXS (nano scale)          | –                                | –
Specific surface area             | SAXS                       | Physisorption / BET              | –
Gel point                         | Sedimentation volume       | –                                | Viscosity
Water retention                   | Sedimentation volume       | WRV                              | –
Stability in dispersion           | –                          | Settling rate                    | –
Surface characterization          | AFM (nano-scale roughness) | SEM (qualitative)                | –


A literature review showed that only a few studies on the safety of nanocelluloses have been published.

The safety of nanocelluloses was assessed using in vitro bioassays. A new testing procedure for analyzing the cyto- and genotoxicological properties of nanocelluloses was developed. Immediate cytotoxicity of nanocelluloses and their sub-lethal effects on cultured human cells, as well as their ability to damage DNA or chromosomes, were studied.

The NFC samples tested can be considered non-toxic (see Table 2). The pure nanocelluloses did not show any cytotoxic or genotoxic effects. The only indications of toxicity were found for the smallest size fractions, which showed an indication of transient morphological changes in human cervix carcinoma (HeLa229) cells in Highest Tolerated Dose (HTD) testing. These samples were highly concentrated and even in this case the morphological changes were not severe. More tests will be needed if the finest particles are to be applied in high concentrations and to identify the actual cause of the morphological changes.

Table 2. Summary of the results from the cytotoxicity and genotoxicity tests. Some of the samples (marked as –) were not tested or they were not microbially clean enough for reliable testing.

4.4 Nanocellulose characteristics determine product properties

The effect of NFC on sheet properties was demonstrated by mixing various amounts of nanocelluloses with two types of fibre commonly used in papermaking (birch kraft and TMP). NFC clearly increases the tensile index, the maximum of which is generally a function of NFC content. There are clear differences between different nanocelluloses. The most dramatic property change was measured for air permeability, which decreased by two orders of magnitude at less than 15% NFC content.

As Figure 3 shows, the mechanical properties of films made exclusively of NFC depend strongly on the NFC type: it is possible to simultaneously increase both stretch and strength with the right selection of NFC.


 

Figure 3. Mechanical properties of NFC films depend strongly on NFC quality. Blue diamonds indicate tensile index values.

 

Table 3. Main characteristics of nanocelluloses.


The tensile index values for the Daicel G NFC material are almost twice as high as for films made of NFC 100-5. The stretch (over 20% for the best case) and TEA index values are also high. The mechanical properties of VTT NFC materials are close to those of Daicel NFC materials. When these results are compared with the properties of the nanocelluloses (Table 3), the best strength values are obtained with long fibrils that have a sufficiently high aspect ratio and a sufficiently branched structure.

4.5 Product properties influenced strongly by process conditions and solvent

The achievable property space of pure nanocellulose sheets is far beyond what can be obtained with ordinary fibres. It is possible to generate dense translucent films as well as fully opaque, plastic-like tough material just by altering the drying speed, temperature and suspension concentration. Similar effects can be obtained by using different solvents. Film-like, translucent structures are gained when nanocellulose structures are formed from aqueous solutions. Paper-like, opaque structures result when nanocellulose films are treated with ethanol.

The solvent also affects the structure of nanocelluloses. For example, the specific surface area of commercial NFC MF40-100 was 6.4 m2/g in dry powder, 180 m2/g in ethanol and 660 m2/g in water. Surprisingly, the NFC films showed nanometre-range porosity only when formed in an ethanol suspension. Ethanol also increased the micrometre- and nanometre-scale porosity of dry sheets.

4.6 The most promising product concepts utilizing cellulose nanofibres

A product survey was carried out to identify the most profitable product concepts. The following potential products were identified in which the utilization of NFC material seems technically and economically promising:
1) Strong board/paper with good barrier and printing properties for packaging applications
2) Label papers
3) High collecting efficiency filters
4) Packaging applications with a thin insulation layer
5) Polymer films
6) Transparent barrier films for packaging applications
7) Thin, strong, thermally stable and smooth structures for electronic applications
8) Laminated papers
9) Insulation materials.

Figure 4. Roughness and dimensional stability of filler-nanocellulose composites is significantly better than for paper and comparable to plastic substrates used in electronic printing.

To demonstrate the usability of NFC in different kinds of products, the following demonstrations were made at the laboratory scale: 1) a filler-NFC composite (>80% fillers) for printed electronics applications, 2) a nanocellulose film as a barrier, and 3) SC paper containing 50% filler. These demo products span a wide range of structural properties and each of them also has considerable market potential. The following sections discuss the main outcomes of these demonstrations.

4.7 Filler-nanocellulose composites as substrates for printed electronics

Mechanically stable and flexible filler-NFC sheets provide new property combinations such as excellent optical properties, a smooth surface and good dimensional stability. These are essential properties for printed electronics substrates. The filler content of a sheet can be as high as 90%.

Figure 4 shows that filler-NFC composites have smoother surfaces than the smoothest paper on the market (photo paper) and dimensional stability comparable to plastic substrates used in electronic printing. The smoothness level of filler-NFC substrates is similar to that of the reference plastic film Mylar A (PET), which is a commonly used substrate for printed electronics. In addition, the filler-NFC substrates gave promising results in preliminary inkjet printing trials.

Figure 5. Nanocellulose films are good oxygen barriers at low humidity. Oxygen permeability, OP (cm3 μm / m2 day kPa), is shown on a logarithmic scale as a function of relative humidity (0, 50 and 75% RH) for polylactic acid (PLA), cellophane, poly(vinyl alcohol) (PVOH), ethylene vinyl alcohol (EVOH), MFC, carboxymethylated cellulose, Daicel NFC, Masuko NFC and TEMPO-oxidized NFC.


The resistances measured for various line widths were lower for the kaolin-based substrate than for the reference PET film. Hence, filler-NFC composites have potential for printed electronics applications.

4.8 Nanocellulose films: potential biobarriers for packaging applications

A common approach to developing more sustainable packaging materials is to replace synthetic materials with renewable ones. Here we address the barrier properties of nanocellulose films.

Table 4. Differences between high filler content and reference SC papers.

Figure 5 shows that films made of nanocellulose (~60 g/m2) have very low oxygen permeability at low humidity. The measured permeability values are in the same range as those recently reported for cellulose-based films as well as for synthetic films such as EVOH and PVOH. Nanocelluloses thus have the potential for use in multilayer packages as oxygen barrier layers in dry conditions.

Figure 6. Vision of the virtual product model environment and the first steps towards it. Parts completed in Phase 1 of the EffTech programme are shown in green.


The major challenges are the sensitivity to moisture and the development of a cost-efficient manufacturing technology.

4.9 Increasing the filler content of SC paper to 50%

For some grades, one of the most effective ways to reduce energy consumption in papermaking is to increase the filler content. Here, we tested the effect of increasing the filler content of SC paper from 24% to 48% by simultaneously adding NFC using a standard laboratory sheet former. The main findings were (Table 4): 2–4% NFC was sufficient for wet strength; optical properties were clearly better than in the reference case; and dry strength dropped by about 40%. Interestingly, the limiting factor seems to be the dry and not the wet strength. We also estimated that the carbon footprint would be around 15% lower for the high-filler case.

4.10 Towards virtual product modelling

Engineering products containing novel raw materials will present a major challenge to the traditional development approach, which is based almost solely on laboratory experiments and pilot- and full-scale testing. Virtual product modelling offers a way to significantly speed up product development.

Figure 7. Combining several models enables multi-scale analysis ranging from atomistic simulations to macroscopic product properties.

Figure 6 presents the long-term vision: to build a virtual product modelling environment in which different simulation solvers can be combined to make predictions on product properties based on materials, microscopic structure and production processes. When such an environment is combined with material databases, the net result is a powerful tool for product development.
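The report does not give implementation details of such an environment. Purely as an illustration of the idea of chaining solvers described above, the following Python sketch shows how a materials database and a few solver stages could be combined into one prediction pipeline; all class and function names, material parameters and property relations below are hypothetical placeholders, not part of the REP or Simantics implementation.

```python
# Hypothetical sketch of a multi-scale "virtual product model" pipeline:
# material data -> structure formation -> macroscopic sheet properties.
# All numbers and relations are illustrative placeholders, not REP results.

MATERIAL_DB = {
    "NFC_A": {"fibril_aspect_ratio": 150, "modulus_GPa": 30.0},
    "NFC_B": {"fibril_aspect_ratio": 60,  "modulus_GPa": 20.0},
}

def form_network(material: dict, solids_content: float) -> dict:
    """Toy 'structure solver': estimate network density from raw material data."""
    packing = min(1.0, solids_content * (1 + material["fibril_aspect_ratio"] / 500))
    return {"relative_density": packing}

def sheet_properties(material: dict, structure: dict) -> dict:
    """Toy 'product solver': map structure to macroscopic sheet properties."""
    stiffness = material["modulus_GPa"] * structure["relative_density"] ** 2
    opacity = 1.0 - 0.6 * structure["relative_density"] ** 0.5
    return {"stiffness_index": stiffness, "opacity_index": opacity}

def predict(material_name: str, solids_content: float) -> dict:
    """Chain the solvers, as an integration platform would orchestrate them."""
    material = MATERIAL_DB[material_name]
    structure = form_network(material, solids_content)
    return sheet_properties(material, structure)

if __name__ == "__main__":
    for name in MATERIAL_DB:
        print(name, predict(name, solids_content=0.02))
```

The point of the sketch is only the data flow: each stage consumes the previous stage's output, so individual solvers can be swapped without changing the rest of the chain.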

4.11 State-of-the-art simulation of materials, processes and products

In REP, several models and simulation tools were developed as a joint effort between five research groups. The models describe the structural, mechanical and optical properties of paper-like structures based on particle-level models covering materials and structures from the nano to the macro scale, i.e. from atomistic simulations to macroscopic product properties (Figure 7).


In addition to the detailed description of material characteristics, physical and chemical process conditions can also be varied during the formation of structures.

The work has concentrated on advanced material modelling, which prepares the way for process and product applications in the EffNet programme. The simulation models currently developed describe:
• Cellulose nanofibres and their material interactions at the atomistic and molecular levels
• Nanocellulose-water dispersion and forming of nanocellulose network structures
• Particle aggregation in varied flow and chemical conditions
• Structure formation at the micro-particle level
• Fibre networks with nanocellulose reinforcement
• Strength, elastic and optical properties of sheet structures

Although the emphasis has been on nanocellulose applications, the adapted generic modelling approach can be applied in many other industrial areas.

Although each simulation tool can be used independently, maximum advantage is gained by combining several solvers together into a multi-scale and multi-physics approach. The simulation tools are integrated with the help of the Simantics platform recently developed at VTT (https://www.simantics.org). It serves as an open, high-level application platform where different computational tools can be easily integrated to form a common environment for modelling and simulation. In this project, the synergy achieved by combining different structural simulations was demonstrated in the case of a filler-NFC substrate for printed electronics.

The main results from the simulations are:
• The amorphous regions of nanoscale fibrils strongly affect both the elastic properties of single fibrils and the interaction force between fibrils.
• The theoretical minimum energy required for nanocellulose production is less than 2% of the current energy consumption in pure mechanical grinding.
• Structures created from NFC are extremely sensitive to 1) the type of interactions, 2) the intensity and spatial distance of the interactions, 3) consistency, and 4) the morphology of the fibrils.
• Aggregation simulations, which combine physical flow conditions with the chemical environment, were shown to be effective at reproducing the experimentally measured aggregate size distributions for various shear rates (a simplified sketch of this type of model is given after this list).
• The stiffness of filler-NFC composites can be changed by 50% by controlling the aggregate size distribution alone. The best reflectance is obtained for loosely packed aggregates with the narrowest aggregate size distribution.
• For wet fibre networks, stretch is strongly affected by the disordered structure, whereas strength and stiffness were only slightly affected. This has an important implication for paper machine runnability.
• Bond stiffness controls the stiffness of a wet fibre network, and the adhesion properties of the bonds control the network strength.
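The actual aggregation models used in REP are not reproduced in this report. As a heavily simplified, self-contained illustration of the idea behind such simulations, the sketch below integrates a Smoluchowski-type population balance with a shear-dependent collision rate and an assumed shear-induced breakup term; all rate constants and exponents are invented for illustration only.

```python
# Minimal Smoluchowski-style aggregation sketch with shear-dependent collision
# and breakup rates. Illustrative only; not the REP model or its parameters.

def simulate(shear_rate: float, n_max: int = 20, steps: int = 2000, dt: float = 1e-3) -> float:
    """Evolve number concentrations c[k] of aggregates containing k primary particles."""
    c = [0.0] * (n_max + 1)
    c[1] = 1.0                              # start from primary particles only
    k_coll = 1e-2 * shear_rate              # orthokinetic collisions scale with shear
    k_break = 1e-4 * shear_rate ** 1.5      # assumed breakup rate grows faster with shear

    for _ in range(steps):
        dc = [0.0] * (n_max + 1)
        # aggregation: an i-mer and a j-mer form an (i+j)-mer
        for i in range(1, n_max):
            for j in range(i, n_max + 1 - i):
                factor = 1.0 if i != j else 0.5   # avoid double counting i == j
                rate = factor * k_coll * c[i] * c[j]
                dc[i] -= rate
                dc[j] -= rate
                dc[i + j] += rate
        # breakup: a k-mer splits into two roughly equal halves
        for k in range(2, n_max + 1):
            rate = k_break * (k - 1) * c[k]
            dc[k] -= rate
            dc[k // 2] += rate
            dc[k - k // 2] += rate
        c = [ci + dt * dci for ci, dci in zip(c, dc)]

    mass = sum(k * ck for k, ck in enumerate(c))
    return sum(k * k * ck for k, ck in enumerate(c)) / max(mass, 1e-12)

if __name__ == "__main__":
    for g in (10.0, 100.0, 1000.0):
        print(f"shear rate {g:7.1f} 1/s -> mass-weighted mean aggregate size {simulate(g):.2f}")
```

Varying the shear rate shifts the steady-state balance between collision and breakup, which is the mechanism by which such models produce shear-dependent aggregate size distributions.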

5. Future plans and key development needs

Although the potential of cellulose nanofibres has been demonstrated in this and several other projects, there are still several challenges to overcome before nanocellulose-based products are on the market.


A central challenge is to develop a sustainable web production technology, the main technical bottlenecks of which will be forming and dewatering. Thus, the next step is to continue up-scaling the most promising technologies and to show how a new material basis together with these technologies leads to extended properties for fibre-based products.

From the modelling point of view, the idea is to continue model development by applying the models to solving industrially relevant problems. Integration of the tools into the virtual product modelling environment will not only enable multi-scale analysis but will also provide easy access to the combined tools. It will also ensure the usability of the models in the long run.

Finally, one crucial issue will be to ensure the economic feasibility and sustainability of the new technologies and products. Thus far, LCA screenings of potential technologies have been made, but these need to be refined once the technological alternatives have been finally determined. Another essential point is to ensure the safety of nanocelluloses, which is of utmost importance when considering both end users and those handling nanomaterials on the production line. Crucially, a thorough understanding is needed of the nanocellulose characteristics that are relevant to product safety. This calls for efficient networking at the European level in the field of safety and characterization of cellulose nanofibres.

6. Impact of results and industrial relevance

The experimental results achieved thus far have already shown that the range of products that can be made from materials containing nanocellulose is much wider than can be achieved with current papermaking raw materials. The LCA and safety screenings indicate that we are moving in a sustainable direction. From the industry perspective, this will create direct potential for new fibre-based products and opportunities for resource savings within current technologies and products.

A direct impact of the research has been seen in the introduction of the developed or applied techniques in company-specific projects. The research experiments have also indicated the need for new processes and generated potential technologies. These technologies show substantial potential for saving energy and water in current papermaking processes, but they are also well suited to the manufacture of novel products. They have also led to invention notifications and patent applications.

The exploitation of the results has been activated through intensive dialogue with industrial partners. This has helped to steer the project while also ensuring fast technology transfer to companies. Several manuscripts related to the modelling activities have been written. We hope that the modelling platform will transform the way new products are developed in the future paper industry.

To conclude, the work started in this project supports the renewal of the Finnish forest products industry, the economic implications of which will be immense. New production equipment will be required, new process chemistries will need to be developed and a range of new products will be produced that meet various consumer needs. The ripple effect will be felt throughout the Finnish economy for many years. In the short term, the results will increase the competitiveness of existing assets and grades against geographic regions that enjoy intrinsic material advantages.


7. Publications and reports

Hellén, E.K.O., Maloney, T.C., Re-engineering paper – renewal from a material perspective, Forest Tech Europe 2008 conference, 6.11.2008, Helsinki, Finland.

Hellén, E.K.O., Re-engineering paper using nanocellulose and multiscale modeling, Papermaking Research Symposium 2009, 1–4.6.2009, Kuopio, Finland.

Hellén, E.K.O., Re-engineering paper using nanocellulose and multiscale modeling, 2009 Intl Conf on Nanotechnology for the Forest Products Ind, 23–26.6.2009, Edmonton, Canada.

Illa, X., Mohtaschemi, M., Puisto, A., Alava, M.J., Rheological Model for Unstable Colloidal Suspensions in Pipe Flow, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.

Ketoja, J., Hellén, E., Lappalainen, J., Kulachenko, A., Puisto, A., Alava, M., Penttilä, A., Lumme, K., Paavilainen, S., Róg, T., Vattulainen, I., Vidal, D., Uesaka, T., Multi-scale modeling environment for nanocellulose applications, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.

Kulachenko, A., Uesaka, T., Simulations of wet fibre network deformation, Progress in Paper Physics Seminar, June 7–10, 2010, Montréal, QC, Canada.

Leppänen, K., Peura, M., Kallonen, A., Penttilä, P., Lucenius, J., Sievänen, J., Sneck, A., Serimaa, R., Characterization of nanofibrillated cellulose samples using x-ray scattering, microtomography, scanning and transmission electron microscopy, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.

Leppänen, K., Pirkkalainen, K., Penttilä, P.A., Sievänen, J., Kotelnikova, N., Serimaa, R., Small-angle x-ray scattering study on the structure of microcrystalline and nanofibrillated cellulose, Journal of Physics: Conference Series 247, p. 012030, 11 p., 2010.

Madani, A., Kiiskinen, H., Olson, J., Martinez, M., Fractionation of microfibrilated cellulose and its effect on the physical properties of paper composites, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.

McWhirter, J.L., Paavilainen, S., Järvinen, J., Róg, T., Vattulainen, I., Atomistic modeling of cellulose nanofibrils: Elastic Properties, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.

Paavilainen, S., McWhirter, J.L., Róg, T., Vattulainen, I., Atomistic modeling of cellulose nanofibrils and their interactions, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.

Paavilainen, S., McWhirter, J.L., Orlowski, A., Róg, T., Vattulainen, I., Computational perspective to cellulose nanofibrils through atomistic simulations, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.

Paavilainen, S., Róg, T., Vattulainen, I., Analysis of twisting in cellulose nanofibrils during molecular dynamics simulations, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.

Paavilainen, S., Róg, T., Järvinen, J., Vattulainen, I., Modeling cellulose nanofibrils with non-crystalline parts, Fysiikan Päivät 2010 (Physics Days 2010), 16–17.3.2010, Jyväskylä, Finland.

Pitkänen, M., Honkalampi, U., von Wright, A., Sneck, A., Hentze, H.-P., Sievänen, J., Hiltunen, J., Hellén, E., Nanofibrillar cellulose – in vitro study of cytotoxic and genotoxic properties, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.

Puisto, A., Mohtaschemi, M., Illa, X., Alava, M.J., Modeling the Rheology of Nanocellulose Suspensions, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.

Puisto, A., Mohtaschemi, M., Alava, M.J., Rheology changes in aggregating suspensions, Workshop on Chemo-hydrodynamic patterns and instabilities, 28–30.10.2009, Brussels, Belgium.

Puisto, A., Alava, M.J., Aggregation of calcite particles in water suspension, Papermaking Research Symposium 2009, 1–4.6.2009, Kuopio, Finland.

Puisto, A., Alava, M.J., Aggregation of calcite particles in water suspension, Fysiikan Päivät 2009 (Physics Days 2009), 12–14.3.2009, Otaniemi, Finland.

Vidal, D., Uesaka, T., Smoothed Dissipative Particle Dynamics Model for Predicting Self-Assembled Nano-Cellulose Fibre Structures, 2010 TAPPI International Conference on Nanotechnology for the Forest Products Industry, 27–29.9.2010, Espoo, Finland.


Drainage and web formation

Project Manager: Janne Poranen, [email protected]

Duration of the project: 1.6.2008–30.6.2010

Project budget: EUR 2,000,000

Project partners and roles of the participating organizations:

• VTT Technical Research Centre of Finland: Project management and equipment construction.

• Betamet: Assembly together with VTT.

• Etteplan Oy: Mechanical design of the press section.

• Kemira: Design of retention chemical system.

• Metso Automation: Process automation.

• Metso Paper: Process design; planning and designing of press section layout.

• Senatti kiinteistöt Oy: Modification of main distribution room.

• Tamfelt: Design and supply of press felts and fabrics.


Abstract

The focus of the Drainage and web formation (SUORA) project was the development of a novel papermaking research environment to serve the development needs of the Finnish forest cluster. The research environment provides new opportunities to form networks between leading expertise and to develop new competence. This, in turn, serves to support the generation of innovations and to boost the development of new products. The environment dramatically improves possibilities to test new products, process concepts and new types of equipment in realistic papermaking conditions. Since the environment is smaller and more flexible than industrial pilot machines, it enables cost-efficient and rapid development of new products and processes for the Finnish forest cluster. SUORA gathers and combines knowledge from different sectors, providing a fertile environment for creating new products and innovations. The SUORA environment is equipped with advanced on-line sensors for process monitoring and different research needs. Detailed information, such as the behaviour of individual drainage components, can be obtained using special measurements developed in-house. The SUORA research environment is designed to be as flexible as possible: the short circulation set-up can be varied, different headboxes can be used, three different forming concepts are possible, and concepts for small (500 kg dry pulp) and large (3000 kg dry pulp) raw material volumes are available ready to use.

Keywords: drainage, web forming, pressing, research environment, papermaking, product properties


1. Project background

The focus of the Drainage and web formation (SUORA) project was the development of a novel papermaking research environment to serve the development needs of the Finnish forest cluster. The research environment provides new opportunities to form networks between leading expertise and to develop new competence. This, in turn, serves to support the generation of innovations and to boost the development of new products.

The environment dramatically improves possibilities to test new products, process concepts and new types of equipment in realistic papermaking conditions. Since the environment is smaller and more flexible than industrial pilot machines, it enables cost-efficient and rapid development of new products and processes for the Finnish forest cluster. SUORA gathers and combines knowledge from different sectors, providing a fertile environment for creating new products and innovations.

2. Project objectives

The main objective of the project was to develop a unique and convertible papermaking research environment which utilizes the latest papermaking technology. The environment enables flexible and cost-effective research in close cooperation between companies and research organizations. The goals of the project are to accelerate and intensify the development work of companies and to pilot new ideas towards industrial solutions. During the project, a wet pressing and sampling unit was developed and integrated as part of the existing research infrastructure, and research work was started.

   

Figure 1. Position of the SUORA environment in the development chain of new products and processes.


3. Research approach

SUORA was implemented as an investment project and was separated into four tasks:
Tasks 1 & 2
• Unit and sampler design
• Mechanics, automation, control
Task 3
• Implementation
• Materials purchasing
• Production of parts
• Installation
Task 4
• Start-up

The press and sampling unit investments were implemented between summer 2009 and spring 2010. To meet the needs of newer and much larger processes, a new monitoring room, pump room and electricity room were also constructed, and extensive process modifications were carried out. The research environment was started up and its first sample paper rolls were produced in May 2010. Several training sessions have been held jointly with the machine supplier, and special emphasis has been placed on work safety issues.

4. Results

4.1 Process layout

The process layout of SUORA is presented in Figure 2. There are three different process options for trials:
• Once-through (no circulation)
• Small volume circulation (15 m3) – 500 kg dry pulp
• Large volume circulation (100 m3) – 3000 kg dry pulp

Figure 2. Process layout.


4.2 Forming and press section layouts

Three different forming geometries (Figure 3) can be studied in the SUORA environment: gap, hybrid and fourdrinier. The gap former is a shoe-blade type with no forming roll.

Key technical specifications for the forming section:
• Web speed 2500 m/min
• Max. headbox flow 240 l/s/m
• Web width 300 mm
• Fabric width 500 mm

The press section (Figure 4) consists of a 1-nip press with a 350 mm extended nip and a sampling unit. Technical specifications:
• Press
  o 1-nip geometry
  o Nip length 350 mm (options 250 & 450 mm)
  o Max line load 2000 kN/m
  o Web width ~250 mm
  o Belt width 700 mm
• Sampling
  o Core diameter 6”
  o Vacuum-equipped sampling rolls, d=300 mm and d=640 mm
  o Roll samples can be achieved up to 1500 m/min with 50% dry content (depending on press conditions and paper grade)

Figure 3. Gap and hybrid forming setups.

Either small or large volume circulation is typically used. The circulation time for the small volume circulation is in the order of 5 min and for the large circulation 30 min (based on a 120 l/s/m headbox flow); a simple check of these figures is sketched at the end of this section.

Two 100 m3 tanks are currently available, one for process water and one for pulp. The tank capacity will be more than doubled during 2011, enabling multicomponent raw material systems to be run.

Stock preparation from dry rolls is possible using the facilities at the Metso Jyväskylä plant. For pulp refining, KCL’s facilities in Otaniemi are available for use.

A4 paper samples from SUORA can be dried at the laboratory scale at VTT. Roll samples can currently be dried at KCL or, after VTT’s new drying environment is commissioned in 2011, also at the Jyväskylä plant. Surface treatments for dry paper samples can be performed at various facilities throughout Finland, such as KCL or the Forest Pilot Center (FPC). VTT’s new cutting-edge laboratory-scale surface treatment environment at Otaniemi enables the development of radically new kinds of fibre-based products.
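As a rough consistency check of the circulation times quoted above, the short sketch below simply divides the circulation volume by the total headbox flow. The assumption that the specific flow of 120 l/s/m acts over roughly the 0.5 m fabric width is ours, not stated in the report.

```python
# Rough check of the SUORA circulation times quoted in Section 4.2.
# Assumption (not stated in the report): the specific headbox flow of
# 120 l/s/m acts over approximately the 0.5 m fabric width.

HEADBOX_FLOW_L_PER_S_PER_M = 120.0
EFFECTIVE_WIDTH_M = 0.5          # assumed effective width

def circulation_time_min(volume_m3: float) -> float:
    """Time to turn over one circulation volume at the assumed total flow."""
    flow_l_per_s = HEADBOX_FLOW_L_PER_S_PER_M * EFFECTIVE_WIDTH_M
    return (volume_m3 * 1000.0) / flow_l_per_s / 60.0

if __name__ == "__main__":
    print(f"small circulation (15 m3):  ~{circulation_time_min(15):.0f} min")   # ~4 min
    print(f"large circulation (100 m3): ~{circulation_time_min(100):.0f} min")  # ~28 min
```

The results (about 4 min and 28 min) agree with the quoted 5 min and 30 min to within the rounding of the assumed width.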


Figure 4. Press section layout.

4.3 Special measurements

When studying the fundamentals of web forming and drainage, various special measurements are essential. The SUORA environment is equipped with several on-line sensors and special measurement instruments to provide a close-up view of process conditions and web characteristics. Key measurements include:
• Forming geometry: The environment is equipped with a customised camera system which records the exact location of the headbox and the jet coming out of the headbox. With the help of image analysis software, the jet length and impingement angle can be precisely determined (a simple geometric sketch of this relationship is given at the end of this section).
• On-line dewatering rate and process water sampling: Each dewatering element in the former and press section is equipped with flow rate measurement via a measurement weir. The total number of weirs is 22. Process water sampling units have also been developed.
• Wet-end analyses: Basic measurements such as pH, temperature and conductivity are continuously measured at multiple locations, and the following can also be measured on-line:
  o Dissolved calcium and other multivalent cations
  o Various titrated quantities such as alkalinity, dithionite, sulphur dioxide and dissolved starch
  o Gas content and dissolved oxygen content
  o Charge
  o COD
  o Redox potential
  o Volatile compounds in process waters using an electronic nose
• Fractionator: the FracOn analyser developed by Metso Automation measures the fractionation of fibres in a pulp sample
• Friction between vacuum box and fabric: the friction force acting between the wire and the suction box lid can be measured using three force sensors
• On-line formation: the optical formation of the web can be analyzed

When running trials, on-line tools for data acquisition and reporting are essential. A Savcor Wedge process diagnostics system was installed in the SUORA environment in spring 2010. The Wedge system enables the plotting of on-line trends and the calculation of trial-point-specific information, including paper laboratory data, during the trial.
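The report does not describe the image-analysis algorithm itself. Assuming only that the camera system yields the slice opening position and the jet landing point on the wire, the sketch below shows the elementary geometry by which jet length and impingement angle could then be computed; the straight-chord approximation, the coordinate convention and all numerical values are our assumptions.

```python
import math

def jet_length_and_impingement_angle(slice_xy, landing_xy, wire_angle_deg=0.0):
    """Straight-chord approximation: jet length is the distance from the slice
    opening to the landing point, and the impingement angle is measured between
    that chord and the wire plane. Inputs are assumed to come from image analysis."""
    dx = landing_xy[0] - slice_xy[0]
    dy = landing_xy[1] - slice_xy[1]
    length = math.hypot(dx, dy)
    jet_angle_deg = math.degrees(math.atan2(-dy, dx))   # a downward jet gives a positive angle
    return length, jet_angle_deg - wire_angle_deg

if __name__ == "__main__":
    # Illustrative coordinates in metres (machine direction x, vertical y).
    length, angle = jet_length_and_impingement_angle((0.00, 0.05), (0.30, 0.00))
    print(f"jet length {length * 1000:.0f} mm, impingement angle {angle:.1f} degrees")
```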

Figure 5. Press section and sampler.

5. Exploitation plan and impact of results

A joint research project has been launched with an industrial consortium using the current papermaking research environment. The Fundamentals of Web Forming project (SUORA-KH) was a consortium project between VTT and five Forestcluster Ltd owner companies: UPM-Kymmene, Kemira, Tamfelt, Metso Paper and Metso Automation. The project time frame was 1.8.2008–30.9.2010. The objective of the project was to develop a unique papermaking research environment and efficient technologies for papermaking in co-operation between the project partners.

Results from laboratory-scale studies carried out in the EffNet programme will be verified in the SUORA environment.


Examples of SUORA’s role as a test environment prior to production-scale work:

Verification of project results
• QVision results => new measurement and control systems
• POJO ideas => advanced process solutions

Potential of new products
• REP => scale-up of NFC-based products

High consistency papermaking
• To save energy, raw materials and water

6. Publications and reports

Poranen, J., VTT:n paperinvalmistuksen tutkimusympäristö edistää yritysten nopeaa ja kustannustehokasta tuotekehitystä (VTT’s papermaking research environment promotes fast and cost-efficient product development in companies), VTT:n media-aamiainen (VTT media breakfast), 18.11.2008.

Poranen, J., VTT’s new research infra to develop fibre processes (SUORA), PulPaper, 1.6.2010.


Future paper and board making technologies (TuPaKat)

Project Manager: Juha Lipponen, [email protected]

Duration of the project: 1.6.2008–30.6.2010

Project budget: EUR 400,000

Project partners and roles of the participating organizations:

• Metso: Project management / Juha Lipponen

• VTT Technical Research Centre of Finland: Project coordination and proposal preparation work

• Futura Marketing Oy (Henrikki Tikkanen, HSE; Juha-Antti Lamberg, HUT; Tomi Nokelainen, TUT): Scenario work

• Innovaatiopalvelu Pekka Koivukunnas Oy: Innovation work

• SW-Development: Technology and business development simulation platform

• TRIZ Oy (Kalevi Rantanen): A vision of the forest industry and its products

• University of Oulu: Ideation session with Paper Machinery and Process students


Abstract

The TuPaKat project has produced three alternative paper and board consumption scenarios up to 2050 and three roadmaps up to 2030. The roadmaps focused on the reduction of energy and raw material use, on ideal papermaking and on fibre-based products for printed electronics. In addition, the project group proposed the four most influential technology concepts for future papermaking (open base sheet + functional finishing; foam line; agile and modular papermaking; pressless papermaking) and eight miscellaneous technology and/or business development proposals. The TuPaKat project has been active in gathering project ideas and developing them into project proposals. The work has been widely utilized in the planning of the second phase of the EffTech programme.

Keywords: new technology, radical innovation, paper making, board making, scenario, ideation


1. Project background

The starting point of the project is to develop ideas for radically new production technologies for forest-based businesses. The task is divided in two: 1) radical innovation of “traditional” paper and board making processes, and 2) drafting, through scenario work, the technology needs of future fibre-based products.

2. Project objectives

Concerning paper and board making technologies in particular, the goal is to consider the papermaking process as a whole, instead of the more traditional approach of developing papermaking within its sub-processes. The project produces technology needs for future development through scenario and innovation work.

3. Research approach

The starting point of the project is to develop ideas for radically new production technologies for forest-based businesses. The task is divided in two: 1) radical innovation of “traditional” paper and board making processes, and 2) drafting, through scenario work, the technology needs of future fibre-based products.

Concerning paper and board making technologies in particular, the goal is to consider the papermaking process as a whole, instead of the more traditional approach of developing papermaking within its sub-processes. Figure 1 describes the traditional thinking.

Paper and board making processes consist of dozens of consecutive sub-processes. For several reasons (such as the sub-processes requiring detailed and profound special expertise), development in companies as well as in universities has been divided into different (and isolated) branches of expertise. In turn, this has led to very rigid technological boundary conditions between these sub-processes. This has resulted in little freedom to innovate entirely new papermaking processes, and development has been reduced to incremental sub-optimization.

Figure 2 presents different approaches to the process industry. The left side represents the race of declining prices and cost hunting and its variables. Cost reduction in the forest industry consists of both everyday operation optimization work and periodic, sometimes even revolutionary, improvements in technology. The net effect, however, is the declining profitability we see in the forest industry today. Basically, there is no reason why this funnel would not continue to tighten in the future.

 

Figure 1. Separate sub-process oriented development approach.


The right side presents the effect of new value-added product and service concepts that the forest industry may introduce in the future. The project produces technology needs for future development through scenario and innovation work. These needs are then converted into actual project proposals for Forestcluster’s ongoing and upcoming research projects.

4. Results

4.1 Visions of 2030 Finnish Forest Industry

Kalevi Rantanen studied, on the basis of patterns in technological development, the possibilities for forest products in the year 2030 and what kind of forest industry and economy can exist in Finland in 2030. In this work, the future laws of futures scholar Prof. Emeritus Osmo A. Wiio and the technology development laws of Genrih S. Altshuller were utilized.


 

Figure 2. Business approaches in the process industry for existing business (left) and new value-added businesses (right).

In the year 2030 in Finland, everyone has thousands of computers at their disposal, most of which are made of paper. Cellulose-based e-paper products flourish, but, contrary to many futurologists, text information is also still printed on traditional printing paper and consumed. Intelligent paper (or e-paper) talks, hears, illuminates, communicates and collects its own energy from the environment. Intelligent and biodegradable sensors are distributed to forests and trees. A tree (and the forest as a whole) then determines autonomously that it is ready to be harvested and wirelessly calls a robot to do the harvesting.

Cellulose molecules and the chemistry of wood are generally understood and controlled in a way that generates entirely new branches of forest economy. Cellulose electronics, forest biopolymers and cellulose composites change our everyday life. New kinds of wood biomass are used to produce, through fully automated mass production, very inexpensive mass-tailored houses. Intelligent cellulose textiles replace cotton. Genetically modified trees produce tailor-made industrial materials, food supplements and medicine.



Forest tourism has expanded to such an extent that Everyman’s Right has had to be regulated. A debate on controlling the wandering of wolves with microchips is ongoing.

4.2 Forest Industry Scenarios for 2050

Starting from the widely used classical forecasting of forest products consumption, i.e. correlating the consumption forecast with GNP, product price and population, three alternative scenarios for the regions mentioned were proposed (a simplified sketch of this type of forecast is given after the conclusions below):
• Scenario 1: Paper and board consumption forecasts are made based on GNP and earlier consumption development.
• Scenario 2: Country-by-country GNP was expected to grow at only one third of the predicted rate, and long-term demand elasticity was capped at a maximum of one.
• Scenario 3: Paper and board consumption is expected to follow population once a certain level of GNP and paper consumption per capita has been reached.

The results can be used to draw the following conclusions:
• Although the different consumption scenarios give varied results for paper and board product demand, in every scenario Asia, especially the South and East regions, shows the fastest growth rates.
• In Europe, the fastest growing region is Eastern Europe; growth in the rest of Europe is highly dependent on the assumption of whether or not future growth will continue to follow GNP development.
• Here, it is worth stopping for a minute to consider whether a given paper consumption level per capita has actually already been reached in countries with Western living standards. Any additional growth would then be due to population increase only, which is also not great in Western Europe.
• The North America consumption growth rate was expected to be somewhere between 0.5 and 1.8%. If, again, the assumption of maximum consumption per capita is also applied here, the potential growth rate will be close to the bottom limit. Both scenarios 1 and 2 also expect North American consumption to start declining in the future.
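The project’s actual forecasting models and regional figures are not reproduced in this report. The sketch below only illustrates the classical GNP-elasticity logic referred to above, with invented regional figures and elasticities, and with one possible reading of the Scenario 2 rules (one third of the predicted GNP growth, demand elasticity capped at one).

```python
# Illustrative sketch of GNP-elasticity based consumption forecasting.
# All regional numbers and elasticities are invented for illustration only;
# they are not the figures used in the TuPaKat scenario work.

REGIONS = {
    # region: (current consumption, Mt/a; predicted GNP growth, %/a; demand elasticity)
    "South and East Asia": (120.0, 6.0, 1.3),
    "Eastern Europe":      ( 20.0, 4.0, 1.1),
    "Western Europe":      ( 80.0, 2.0, 0.8),
    "North America":       ( 75.0, 2.5, 0.7),
}

def forecast(consumption, gnp_growth_pct, elasticity, years,
             gnp_factor=1.0, elasticity_cap=None):
    """Consumption growth = elasticity * GNP growth, compounded over the horizon."""
    growth = gnp_growth_pct * gnp_factor
    if elasticity_cap is not None:
        elasticity = min(elasticity, elasticity_cap)
    annual = 1.0 + (elasticity * growth) / 100.0
    return consumption * annual ** years

if __name__ == "__main__":
    horizon = 40  # roughly 2010 -> 2050
    for region, (c0, g, e) in REGIONS.items():
        s1 = forecast(c0, g, e, horizon)                                    # Scenario 1 logic
        s2 = forecast(c0, g, e, horizon, gnp_factor=1/3, elasticity_cap=1)  # Scenario 2 logic
        print(f"{region:20s}  now {c0:6.1f}  S1 {s1:7.1f}  S2 {s2:6.1f} Mt/a")
```

The sketch reproduces only the qualitative behaviour of such scenarios: high-elasticity, high-growth regions dominate the increase, and the Scenario 2 assumptions compress the growth everywhere.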

4.3 Roadmaps

One goal of the TuPaKat project was to produce roadmaps on the reduction of energy and raw material use, on ideal papermaking and on fibre-based products for printed electronics. The timeframe selected for the work was 2010 to 2030.

Science, technology and applications need a joint vision in order to be capable of building the applications needed at a given time at the proper price. These efforts require communication, for which the roadmapping process can be used as one tool. A linkage between science and applications is therefore required, but at the same time some flexibility is needed in order to find the potential applications and new ways to solve given problems.

The three roadmaps made in this project serve as an example of this linkage and flexibility. The first roadmap was made for existing products from the viewpoint of existing technology, science and applications, with flexibility in the options for fulfilling the energy saving needs using different technological solutions. The second roadmap also had flexibility in technology, but the required solutions were approached from needs, not from the technologies available today.


Of course, some solutions were already available to fulfil the requirements. The third roadmap had the highest freedom, and the pace of development in that area is very rapid. The market growth potential is highest there. The growth potential in the first and, to some extent, the second roadmap is more moderate, but a substantial market already exists.

The first roadmap showed that halving energy costs requires a combination of many technological solutions, but that it is possible. The possible savings from each technology can later be combined in whatever way is found suitable (a simple sketch of combining such fractional savings is given after Figure 3). The potential of each technology varies greatly by grade.

Figure 3. Development paths from the state of the art towards the ideal papermaking world, with the research actions along the path to ideal papermaking described in between.

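The roadmap’s technology-specific saving figures are not listed in this report. The sketch below only illustrates one common way of combining independent fractional savings (the remaining energy use is the product of the individual remainders), with invented example numbers, to show why halving energy costs typically needs several technologies together.

```python
# Combining independent fractional energy savings: the remaining specific
# energy is the product of the remainders, total_saving = 1 - prod(1 - s_i).
# The individual saving figures below are invented examples, not roadmap data.

from math import prod

savings_by_technology = {
    "improved dewatering / drier web to dryer": 0.15,
    "advanced refining and fillers":            0.10,
    "heat recovery":                            0.12,
    "drying process improvements":              0.20,
}

remaining = prod(1.0 - s for s in savings_by_technology.values())
total_saving = 1.0 - remaining

print(f"combined saving: {total_saving:.1%}")   # about 46% with these example figures
for name, s in savings_by_technology.items():
    print(f"  {name:42s} {s:.0%}")
```

With these illustrative figures, no single technology comes close to a 50% saving, but four of them combined do, which is the point the roadmap makes about needing a combination of solutions.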

The second roadmap showed the ideal papermaking process and ideal unit processes (Figure 3). For some unit processes a concept was also proposed.

The third roadmap showed the potential of printed electronics and its applicability to fibre-based substrates. The OLED and memory market has extremely high growth rates, which, in combination with the fast pace of development, makes it a challenging but desirable target.


4.4 Concept recommendations

The project group proposed the four most influential technology concepts for future papermaking: open base sheet + functional finishing; foam line; agile and modular papermaking; pressless papermaking.

4.4.1 Open base sheet + functional properties through finishing

Using today’s vacuum roll parameters (vacuum levels and dwell times of commercial vacuum rolls), it is possible to fully impregnate liquids (fillers, binders, various fibre fractions etc.) throughout paper and board products. The technology opens new possibilities for utilizing high-consistency and waterless forming technologies (open and weak base sheets) to adjust the final product properties, as well as for the impregnation of various non-woven products and fibre-based composites. One important aspect of the technology is its potential for introducing entirely new business and product possibilities based on fibre-based base paper and the impregnation of different functional chemicals, additives and fillers. These businesses can then utilize the infrastructure of the forest industry and a great deal of its existing web forming technologies.

The following items should be addressed in order to approach the defined goal:
• Studying less-water, pressless (low-density) base sheets and other open sheet techniques
• Impregnation studies: more detailed sheet-scale testing, planning of dynamic impregnation testing facilities for the SUORA environment, and paper and board quality evaluations based on low-density base sheets + impregnation-treated papers and boards
• Plasma treatment studies
• Light surface treatment studies

4.4.2 Foam line

This technique offers flexible tailoring of layers in web forming by enabling porous structures or very thin layers. It also enables the utilization of nanomaterials in web forming. The benefits of the technique are layer purity and excellent sheet uniformity, as well as the possibility to engineer product properties with a smaller amount of raw material and less energy use in production.

Future actions will include research on (e.g.):
• Tailoring of foam chemistry
  o Screening of surfactants
  o Characterization of foams; measurement techniques
  o Evaluation of the process; foaming of material / mixing of material into foam
• Foam rheology
  o Rheological properties of different kinds of foam-fibre suspensions
  o Foam handling in processes
• Multi-layer products: demonstration of product properties at laboratory scale
• Multi-layer foam forming: scale-up of the new concept
• Sustainability calculations for specified products

4.4.3 Agile and modular papermaking

The objective is to renew value creation by reducing investments per plant and by driving flexibility and reuse with a new architecture for the forest industry:
• Reduce investments per plant
  o Standardisation and the use of the same delivery chains as other industries
  o Reuse of plant modules
• Promote flexibility and reusability
  o Smaller plants with a shorter lifetime at one site
  o One plant capable of producing a wide range of products
  o Reusable plant modules for new sites
  o Innovations enabled via open modularity

The objectives of the research are the following:
• A paper mill concept defined in this project has a shorter time span from capital cost to the start of production, the total investment cost will be lower, and the investment will be more secure thanks to reusability in future projects.
• To define new business concepts related to agile manufacturing of fibre-based products.
• To develop operating models for the forest industry which utilize resource-efficient processes.

The main actions include developing a new mode of operations and introducing an architecture with key technologies and standardization. Here, the successful modularization development work done in other branches of industry (such as the maritime industry and power plants) is introduced to the forest industry.

4.4.4 Pressless papermaking

In many paper and board grades, bulk is one of the key properties of the paper. Bulk (or thickness) is often pursued because of the need for paper stiffness and rigidity. Wet pressing and calendering, however, are responsible for essentially destroying the bulk of paper.

If the water removal after forming and before evaporation (from about 20% to 50% dry content) could be done essentially without wet pressing, the paper bulk should be doubled. If water removal from a dry content of 22% to a dry content of 45–50% (the typical amount of wet pressing drying) were done without pressing, the final paper bulk before calendering could be increased from 1.5 to 2.5…4.0. In other words, the base paper bulk can potentially be doubled. If the final paper bulk could be, for example, doubled, this would mean the following (a worked example follows this list):
• For example, an 80 g/m2 copy paper would be reduced to 40 g/m2 with essentially the same functional properties (thickness, roughness etc.)
• Reducing the basis weight by 50% will also reduce the needed drying energy accordingly
• A more bulky and compressible sheet may be calendered more easily to the required roughness => further bulk savings
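As a worked example of the arithmetic behind these bullet points, the sketch below assumes, as is conventional, that bulk is expressed in cm3/g so that caliper = bulk × basis weight, and assumes a final dryness of 95% after the dryer; the 45% press dryness, the 1.5 bulk and the 80/40 g/m2 basis weights come from the text, while the doubled bulk value of 3.0 is simply twice the reference.

```python
# Worked example: doubling bulk allows halving basis weight at constant caliper.
# Assumption: bulk is in cm3/g, so caliper [um] = bulk [cm3/g] * basis weight [g/m2].

def caliper_um(bulk_cm3_per_g: float, basis_weight_g_per_m2: float) -> float:
    return bulk_cm3_per_g * basis_weight_g_per_m2

def water_to_evaporate_g_per_m2(basis_weight: float, dc_in: float, dc_out: float) -> float:
    """Water removed per m2 of web between the inlet and outlet dry contents."""
    return basis_weight * (1.0 / dc_in - 1.0 / dc_out)

reference = caliper_um(1.5, 80.0)   # conventional copy paper: 120 um
pressless = caliper_um(3.0, 40.0)   # doubled bulk, halved basis weight: still 120 um
print(f"caliper, reference: {reference:.0f} um; pressless: {pressless:.0f} um")

# Evaporation load entering the dryer at 45% dry content, dried to an assumed 95%:
for bw in (80.0, 40.0):
    water = water_to_evaporate_g_per_m2(bw, 0.45, 0.95)
    print(f"{bw:.0f} g/m2 sheet: {water:.0f} g/m2 of water to evaporate")
```

Because the evaporation load scales linearly with basis weight at a fixed dry content, halving the basis weight halves the drying load, which is the logic behind the second bullet above.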

How could one then remove the water without wet pressing? Capillary forces should be able to remove the free water. A “press” dryness of 40 to 50% should be reachable (a practical press dryness). Note that there should be less rewetting in pressless capillary dewatering than in conventional wet pressing.

The following items should be addressed in order to approach the defined goal:
• Research on the pressless press concept
  o Capillary and electro-osmotic transport studies
  o Process modelling
  o Strength development / improvement at low density
• Product research
  o Research on how ultra-low density (compressible) papers behave in printing processes
  o Other paper and board property analyses and studies for new ultra-low density grades


5. Future plans

The results of the TuPaKat project have been used in the preparation of the continuation of the EffTech programme. The results include project preparation and roadmap work. Concerning paper and board making technologies in particular, the results help the industry to consider the papermaking process as a whole, instead of the more traditional sub-process approach to developing papermaking. Accordingly, several line-scale proposals for new technology platforms were introduced.


www.forestcluster.fi