Large-scale plant construction in Germany must rethink its business models to better meet customer requirements and remain competitive in the future. This is a central demand emerging from the joint study conducted by the VDMA's large-scale plant manufacturers' group (AGAB) and the consulting firm PwC. According to the study, the market needs to move away from its previous, technology-focused business models and toward an offering characterized by digital and data-driven services. In other areas, however, the study rates German large-scale plant construction as a global leader, for example in cybersecurity or virtual reality (VR): 94% of plant manufacturers already see VR as an essential basis of future success. The study, which can be viewed free of charge and sees itself as a guideline for successful digitization, identifies a total of 18 key factors for successful transformation, including the development of intelligent logistics concepts and cooperation with partners along the value chain.
North Carolina State University researchers have developed a technique to improve the characteristics of engineered tissues by using ultrasound to align living cells during the biofabrication process.

"We've reached the point where we are able to create medical products, such as knee implants, by printing living cells," says Rohan Shirwaiker, corresponding author of a paper on the work and an associate professor in NC State's Edward P. Fitts Department of Industrial & Systems Engineering. "But one challenge has been organizing the cells that are being printed, so that the engineered tissue more closely mimics natural tissues.

"We've now developed a technique, called ultrasound-assisted biofabrication (UAB), which allows us to align cells in a three-dimensional matrix during the bioprinting process. This enables us to create a knee meniscus, for example, that is more similar to a patient's original meniscus. To date, we've been able to align cells for a range of engineered musculoskeletal tissues."

To align the cells, the researchers built an ultrasound chamber that allows ultrasonic waves to travel across the area where a bioprinter prints living cells. These waves travel in one direction and are then reflected back to their source, creating a "standing ultrasound wave." The sound waves effectively herd the cells into rows, which align with the areas where the outgoing and reflected waves cross each other.

"We can control the alignment characteristics of the cells by controlling the parameters of the ultrasound, such as frequency and amplitude," Shirwaiker says.

To demonstrate the viability of the UAB technique, the researchers created a knee meniscus with the cells aligned in a semilunar arc - just as they are in a natural meniscus. "We were able to control the alignment of the cells as they were printed, layer by layer, throughout the tissue," Shirwaiker says. "We've also shown the ability to align cells in ways that are particularly important for other orthopedic soft tissues, such as ligaments and tendons."

The researchers also found that some combinations of ultrasound parameters led to cell death. "This is important, because it gives us a clear understanding of both what we can do to improve tissue performance and what we need to avoid in order to preserve living cells," Shirwaiker says. To that end, the researchers have created computational models that allow users to predict the performance of any given set of parameters before beginning the biofabrication process.

One other benefit of the UAB technique is that it is relatively inexpensive. "There's a one-time cost for setting up the ultrasound equipment - which can use off-the-shelf technology," Shirwaiker says. "After that, the operating costs for the ultrasound components are negligible. And the UAB technique can be used in conjunction with most existing bioprinting technologies.

"We have a patent pending on the UAB technique, and we are now looking for industry partners to help us explore commercialization," Shirwaiker says.
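The geometry behind this alignment is straightforward: in a standing wave, pressure nodes sit half a wavelength apart, so the spacing of the cell rows follows directly from the drive frequency and the speed of sound in the medium. The short Python sketch below illustrates the relationship; it is not code from the NC State group, and the water-like sound speed, rigid-reflector geometry and 1 MHz example frequency are all illustrative assumptions.

```python
# Illustrative sketch, not the NC State group's code: where rows of cells
# form in a standing ultrasound wave. Assumes a water-like medium
# (c ~ 1500 m/s) and a rigid reflector (pressure antinode at the reflector,
# first node a quarter wavelength away) -- both illustrative assumptions.

def node_positions(frequency_hz, chamber_length_m, speed_of_sound_m_s=1500.0):
    """Return axial positions (m) of pressure nodes, where cells collect.

    Adjacent pressure nodes sit half a wavelength apart, so the row
    spacing of the aligned cells is c / (2 * f).
    """
    half_wavelength = speed_of_sound_m_s / (2.0 * frequency_hz)
    positions = []
    x = half_wavelength / 2.0  # first node: a quarter wavelength from the reflector
    while x < chamber_length_m:
        positions.append(x)
        x += half_wavelength
    return positions

# Example: a 1 MHz standing wave across a 10 mm chamber spaces the rows ~0.75 mm apart.
rows = node_positions(frequency_hz=1e6, chamber_length_m=0.010)
print(len(rows), "rows,", round((rows[1] - rows[0]) * 1e3, 2), "mm apart")
```

Changing the frequency therefore shifts and rescales the rows, which is consistent with Shirwaiker's point that alignment can be tuned via the ultrasound parameters.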
Biomedical engineers at Duke University have developed an automated process that can trace the shapes of active neurons as accurately as human researchers can, but in a fraction of the time. This new technique, based on using artificial intelligence to interpret video images, addresses a critical roadblock in neuron analysis, allowing researchers to rapidly gather and process neuronal signals for real-time behavioral studies. The research appeared this week in the Proceedings of the National Academy of Sciences.

To measure neural activity, researchers typically use a process known as two-photon calcium imaging, which allows them to record the activity of individual neurons in the brains of live animals. These recordings enable researchers to track which neurons are firing, and how they potentially correspond to different behaviors.

While these measurements are useful for behavioral studies, identifying the individual neurons in the recordings is a painstaking process. Currently, the most accurate method requires a human analyst to circle every 'spark' they see in the recording, often stopping and rewinding the video until the targeted neurons are identified and saved. To further complicate the process, investigators are often interested in identifying only a small subset of active neurons that overlap in different layers within the thousands of neurons that are imaged. This process, called segmentation, is fussy and slow: a researcher can spend anywhere from four to 24 hours segmenting neurons in a 30-minute video recording, and that's assuming they're fully focused for the duration and don't take breaks to sleep, eat or use the bathroom.

In contrast, the new open-source automated algorithm developed by image processing and neuroscience researchers in Duke's Department of Biomedical Engineering can accurately identify and segment neurons in minutes. "As a critical step towards complete mapping of brain activity, we were tasked with the formidable challenge of developing a fast automated algorithm that is as accurate as humans for segmenting a variety of active neurons imaged under different experimental settings," said Sina Farsiu, the Paul Ruffin Scarborough Associate Professor of Engineering in Duke BME.

"The data analysis bottleneck has existed in neuroscience for a long time -- data analysts have spent hours and hours processing minutes of data, but this algorithm can process a 30-minute video in 20 to 30 minutes," said Yiyang Gong, an assistant professor in Duke BME. "We were also able to generalize its performance, so it can operate equally well if we need to segment neurons from another layer of the brain with different neuron sizes or densities."

"Our deep learning-based algorithm is fast, and is demonstrated to be as accurate as (if not better than) human experts in segmenting active and overlapping neurons from two-photon microscopy recordings," said Somayyeh Soltanian-Zadeh, a PhD student in Duke BME and first author on the paper.

Deep-learning algorithms allow researchers to quickly process large amounts of data by sending it through multiple layers of nonlinear processing units, which can be trained to identify different parts of a complex image. In their framework, the team created an algorithm that processes both spatial and temporal information in the input videos. They then 'trained' the algorithm to mimic the segmentation of a human analyst while improving its accuracy.
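As an illustration of what processing "both spatial and temporal information" can look like in practice, here is a minimal PyTorch sketch of a spatiotemporal segmentation network. It is not the Duke team's published architecture (their open-source code is available separately); the layer sizes and the time-averaging head are toy choices for illustration only.

```python
# Illustrative sketch only -- not the Duke team's published architecture.
# A toy PyTorch network that, like the approach described above, processes
# spatial and temporal information jointly via 3D convolutions and outputs
# a per-pixel "active neuron" probability map.
import torch
import torch.nn as nn

class ToyNeuronSegmenter(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # 3D kernels span (time, height, width), so a flickering neuron
            # looks different from static background.
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(16, 1, kernel_size=1)  # per-pixel score

    def forward(self, video):
        # video: (batch, 1, time, height, width)
        x = self.features(video)
        x = x.mean(dim=2)  # collapse the time axis
        return torch.sigmoid(self.head(x))  # neuron probability per pixel

# Example: segment a 16-frame, 64x64 calcium-imaging clip (random data here).
model = ToyNeuronSegmenter()
mask = model(torch.randn(1, 1, 16, 64, 64))
print(mask.shape)  # torch.Size([1, 1, 64, 64])
```

Training such a toy model to mimic a human analyst would then amount to minimizing, for example, a binary cross-entropy loss between its output and human-drawn masks.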
The advance is a critical step towards allowing neuroscientists to track neural activity in real time. Because of their tool's widespread usefulness, the team has made their software and annotated dataset available online. Gong is already using the new method to more closely study the neural activity associated with different behaviors in mice. By better understanding which neurons fire for different activities, Gong hopes to learn how researchers can manipulate brain activity to modify behavior.
Natural rubber from rubber trees is a raw material in limited supply. Synthetically produced rubber, on the other hand, has so far been unable to match the abrasion behavior of the natural product, rendering it unsuitable for truck tires. But now, for the first time, a new type of synthetic rubber has been developed that achieves 30 to 50 percent less abrasion than natural rubber.

Truck tires have to put up with a lot: As a result of the heavy loads they carry and the long distances they travel every day, they are subject to heavy wear and tear. Consequently, the treads of the tires are manufactured primarily from natural rubber, which comes from rubber trees and to date has demonstrated the best abrasion characteristics. Until now, artificially manufactured rubber has been unable to match the performance of natural rubber, at least in this respect. The problem with natural rubber is that the security of supply of this important raw material is endangered. In Brazil, the original home of the rubber tree, the fungus Microcyclus ulei is laying waste to whole plantations. If the fungus crosses over to Asia, where the major cultivation areas are located today, global rubber production will be threatened.

Biomimetic synthetic rubber with optimized abrasion behavior (BISYKA)

In view of this threat, researchers at the Fraunhofer Institutes for Applied Polymer Research IAP, for Microstructure of Materials and Systems IMWS, for Molecular Biology and Applied Ecology IME, for Mechanics of Materials IWM and for Silicate Research ISC have now optimized the characteristics of synthetic rubber. “Our synthetic rubber BISYKA – a German acronym for ‘biomimetic synthetic rubber’ – actually has superior characteristics to natural rubber,” says Dr. Ulrich Wendler, who heads up the project at the Fraunhofer Pilot Plant Center for Polymer Synthesis and Processing PAZ in the German municipality of Schkopau. Fraunhofer PAZ is a joint initiative between Fraunhofer IAP and Fraunhofer IMWS. “Tires made of the synthetic rubber lose 30 percent less mass than equivalent tires made of natural rubber. On top of that, the synthetic tires have only half the tread loss. Furthermore, the synthetic rubber can be produced on an industrial scale using existing plants and equipment. This means that the synthetic rubber offers an excellent alternative to natural rubber – including in the domain of high-performance truck tires.”

Targeted analysis of dandelion rubber

But how did the researchers achieve this higher performance? At Fraunhofer IME, scientists investigated rubber from dandelions. Like the rubber from rubber trees, dandelion rubber consists of 95 percent polyisoprene, with the remainder made up of organic components such as proteins or lipids. The advantage of dandelion rubber over tree rubber: the former has a generation cycle of just three months, as opposed to seven years for the latter. That makes dandelion rubber an ideal starting point for investigating the influence of organic components on rubber characteristics. To this end, the Fraunhofer researchers eliminated the key organic components one by one in a targeted manner. Once they had identified the organic components that matter for abrasion behavior, the researchers at Fraunhofer IAP synthesized the BISYKA rubber from functionalized polyisoprene of high microstructural purity and the relevant biomolecules.
Their colleagues at Fraunhofer IWM and IMWS then investigated the characteristics of the rubber variants thereby obtained. To do this, they used extensional crystallization: If you stretch natural rubber to three times its length, crystalline regions form – the rubber hardens. “The extensional crystallization of BISYKA rubber equals that of natural rubber,” explains Wendler.

When making truck tires, the rubber is usually mixed with carbon black – which is where the black color comes from. Increasingly, however, manufacturers are adding silicates to the mixture instead of carbon black. This is where the expertise of Fraunhofer ISC comes in: At the institute, scientists investigate how new kinds of silica fillers can lead to optimum alternatives to natural rubber in the automotive industry.

Synthetic rubber yields impressive results in practical tests

After the development of the BISYKA rubber, it was put to the test: Would it deliver what its extensional crystallization promised? The researchers handed this question over to an external and thus independent partner: Prüflabor Nord. For this purpose, four car tires were manufactured with a tread made from BISYKA and then compared with tires with a tread made from natural rubber. The tests were carried out directly on a car that drove 700 circuits in one direction and then 700 circuits in the other direction. And the result? While the natural rubber tire was 850 grams lighter after the test and had lost 0.94 millimeters of tread, the BISYKA tire lost merely 600 grams and 0.47 millimeters of tread. The rolling resistance of the synthetic rubber was also better: While the natural rubber achieved a score of C on the traffic light labelling for rolling resistance, BISYKA achieved the higher score of B. “So far, we have only carried out initial tests with the BISYKA tire blend, but they are extremely promising. As the next step, we want to further optimize the BISYKA rubber. This concerns above all the proportion and composition of the organic components. At the same time, we will adapt the formula of the tread compound for truck tires to the new rubber,” says Wendler. Currently, the researcher and his team are looking for cooperation partners to bring the product to market.
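As a quick consistency check, the figures from the practical test line up with the headline claims quoted earlier in the article:

```python
# Back-of-the-envelope check using the test figures quoted in the article.
natural_mass_g, bisyka_mass_g = 850, 600        # mass lost per tire
natural_tread_mm, bisyka_tread_mm = 0.94, 0.47  # tread lost per tire

print(f"mass loss reduced by {1 - bisyka_mass_g / natural_mass_g:.0%}")       # ~29%, i.e. "30 percent less mass"
print(f"tread loss reduced by {1 - bisyka_tread_mm / natural_tread_mm:.0%}")  # 50%, i.e. "half the tread loss"
```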
Throughout the day, our skin is exposed to a range of hostile elements: wind, rain, sunlight, central heating, vehicle emissions... It is therefore vital to ensure it receives proper care. Most importantly, this means choosing a skincare product that is suitable for your type of skin. Fraunhofer researchers have now come up with a commercially viable method of producing a facial skincare product that is precisely tailored to the actual condition of your skin. What’s more, it contains nothing but essential ingredients.

Drugstore shelves are crammed with different face creams. But which one is best for your particular type of skin? Making the right choice can be a tricky business, not least because other factors are involved, such as the season of the year, your current stress levels, hormonal balance and age. All of these can have a significant impact on the current condition of your skin. Lots of women would love to have a face cream that is just right for their needs. Such a personalized skincare product is now available from the company Skinmade. A spin-off from the Fraunhofer Institute for Manufacturing Engineering and Automation IPA, Skinmade was set up by Viktor Balzer, an industrial engineer at Fraunhofer IPA, and Dr. Lars Rüther, a molecular biologist from Dermatest GmbH. Skinmade’s Personal Skin Care is already on sale at three outlets of the cosmetics chain Douglas – in the German cities of Frankfurt, Hamburg and Sindelfingen – and is scheduled for rollout in all major German cities by the end of 2019.

Batch-size-one production at low unit costs

Developed on the basis of five years of research plus the combined know-how of a multidisciplinary team of IT specialists, engineers, dermatologists, pharmacists and biologists, this innovative product has undergone extensive dermatological testing with live subjects and is fully compliant with the EU regulation on cosmetic products. At its heart is an innovative cyber-physical production system that delivers a jar of personalized skin cream at the press of a button – and for a reasonable price. The cost to the consumer for a 30-milliliter jar of Skinmade Personal Skin Care is 40 euros. The touchscreen-operated unit comes in a compact housing the size of a closet. This houses the entire production process, starting with the technology to measure the skin’s natural hydration and lipid levels, and including the production control system as well as all the raw materials plus jars and caps for the cream. At the end of the process, the finished product is proffered on the machine’s dispensing tray.

“We’re basically talking here about ‘batch-size-one’ production, i.e., about being able to produce customized products on a profitable basis,” says Balzer. In effect, he and a research team from Fraunhofer IPA have developed a patented system that enables the production of batch-size-one items at an increasingly low unit cost thanks to the exploitation of economies of scale and other synergies. “A customized product normally entails a high unit cost. But we’ve been able to get around this and produce our cream for a relatively low retail price. Not that I’m going to tell you how our patented system works! Let me just say that the various dosage, homogenization and purification steps are all integrated into one single process. That’s what makes it so fast.” The system is capable of dosing very fine concentrations of individual ingredients in quantities as small as three microliters.
Also underpinning the success of this concept is the team’s specialized knowledge in the field of dermopharmacy – i.e., the effect that certain ingredients have on key biomarkers.

Produce your personalized cream in seven minutes

The first step is to measure the skin’s natural hydration and lipid levels with spot checks on the forehead, the cheek and below the corner of the mouth. This process measures various biomarkers in order to gauge the actual condition of the skin. Using a method known as corneometry, the level of skin hydration is determined. This relies on a measurement of the relative permittivity of the uppermost skin layer, the stratum corneum, which is around 20 micrometers thick. Another method – sebumetry – serves to quantify the level of surface lipids on the skin. This is based on a technique known as grease-spot photometry. Here, an opaque strip of material is placed on the skin for a period of 30 seconds. Contact with the sebum in the skin renders the strip translucent. Analysis of the degree of translucency then serves to determine the lipid levels in the skin. A further technique is employed to measure the skin’s elasticity.

The data from these various measurements are then analyzed by means of self-learning algorithms and neural networks programmed by the team. On this basis, the system can calculate how much of which ingredients should go into the personalized skin cream. Balzer and his team have also put together the training datasets required for machine learning. The entire cyber-physical production system is controlled by means of a cloud solution. Once the data have been analyzed, the results are fed into the machine’s control system. A mere seven minutes later, the customer is presented with a 30-milliliter jar of face cream customized to their precise skin requirements. Customers can also specify their preferred fragrance and consistency.

Skinmade recommends repeating the analysis after six weeks. That way, it can be determined whether the skin has altered and whether a new formulation might be more appropriate. In other words, customers can be sure of always getting a cream that is right up to date with their current skin condition. “A standard skin cream can never be as effective as a customized one,” says Rüther. “It’s possible that a product from the shelf contains ingredients at a concentration you don’t need. That means you might end up getting too much or too little.”

Plans for a mini skin analyzer for home use

In the future, customers will also be able to book skincare consultants – equipped with a mobile skin analyzer – for home visits. Following analysis, skin data will be processed in the cloud. The finished product is then dispatched by mail. Alternatively, customers will be able to order their very own mini skin analyzer online, complete with a special app, and then carry out their own skin measurements before sending the data to Skinmade. This option is scheduled for rollout in 2020. Meanwhile, the team at Skinmade is busy exploring further customized items. The latest project is to put together a bespoke set of personal care products – comprising cleanser, tonic and serum – in which each item is perfectly matched with the others.
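Skinmade does not disclose how its algorithms turn measurements into a formula, so the following Python sketch is purely illustrative: a toy mapping from normalized biomarker readings to ingredient dosages. The ingredient names and dose curves are invented; only the 30-milliliter jar size and the roughly three-microliter dosing resolution are taken from the article.

```python
# Purely illustrative sketch -- Skinmade's actual models and ingredient
# lists are proprietary. A toy mapping from normalized biomarker readings
# (0..1) to ingredient volumes for one jar; the ingredient names and dose
# curves are invented, only the 30 ml jar size and the ~3 microliter
# dosing resolution come from the article.

def formulate(hydration, lipid_level, elasticity, jar_volume_ul=30_000.0):
    """Return ingredient volumes in microliters for one 30 ml jar."""
    # Drier skin (low hydration) gets more humectant; oilier skin
    # (high lipid level) gets less emollient.
    doses = {
        "humectant": 2_000.0 * (1.0 - hydration),
        "emollient": 1_500.0 * (1.0 - lipid_level),
        "firming_agent": 1_000.0 * (1.0 - elasticity),
    }
    doses["base"] = jar_volume_ul - sum(doses.values())
    # Snap every dose to the machine's ~3 microliter dosing resolution.
    return {name: round(vol / 3.0) * 3.0 for name, vol in doses.items()}

print(formulate(hydration=0.35, lipid_level=0.6, elasticity=0.5))
```

In a real system, the fixed dose curves above would be replaced by the trained neural networks the article describes.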
Sensors in autonomous vehicles have to be extremely reliable, since in the future motorists will no longer constantly monitor traffic while underway. In the past, these sensors were subjected to arduous road tests. The new ATRIUM testing device from the Fraunhofer Institute for High Frequency Physics and Radar Techniques FHR now makes it possible to move a large portion of these road tests into the laboratory. ATRIUM puts on a show for the vehicle’s radar sensor, generating artificial scenery that comes very close to the actual conditions encountered in street traffic.

The car of tomorrow will drive itself. Passengers will move down the road as if being driven by a private chauffeur, enjoying conversation, reading a newspaper or perhaps watching a video. Although driver assistance systems such as automatic distance control are no longer new to the market, it will still be several years before completely autonomous cars take to the streets. This is because the technology involved has to be absolutely reliable. The sensors are the deciding factor here: For example, today’s radar sensors are already capable of independently detecting obstacles and applying the brakes in case of danger. These and other sensors are rigorously tested before being installed in the car. And autonomous vehicles require an even higher level of reliability, since if the driver is no longer at the wheel, the vehicle manufacturer may well be ultimately responsible for avoiding an accident.

That is why automobile manufacturers place exacting demands on sensor reliability. They demand sensors that cause no more than a single error over driving distances of several million kilometers, which means that today’s cars often have to complete very long road tests. “That’s a lot of kilometers,” says Dr.-Ing. Thomas Dallmann, head of the Aachen research group at the Fraunhofer Institute for High Frequency Physics and Radar Techniques FHR. “On top of that, multiple sensors have to be tested in order to statistically prove their reliability. This means several test vehicles with sensors have to spend quite a long time on the road.” Another difficulty: If an error occurs after several thousand kilometers, the sensor has to be optimized and the road tests have to begin all over again – an extremely time-consuming process.

Moving road tests to the lab

To simplify this situation, attempts are being made to simulate reality and bring the road tests into the laboratory. This type of laboratory test already exists for radar sensors. Radar sensors emit a radio signal that is reflected by various objects. Based on the echo, electronic sensor systems can then analyze the surroundings, measuring the distance to detected objects and the speed at which they are moving. This principle has already been simulated in the laboratory using what are referred to as radar target simulators. These simulators collect the radar waves emitted by the vehicle radar and modify the signal so that it behaves as if it had encountered objects. The simulator then returns the information to the car in the form of an artificial echo image. Thus the radar target simulator generates a simulated landscape for the vehicle’s radar. The advantage is obvious: The test rig with a car radar and radar target simulator can run in the laboratory day and night, without having to put a car onto the street. Unfortunately, the few radar target simulators available on the market today are nowhere close to being able to generate a complete echo landscape.
“Most of the models can only generate a highly restricted image with a single-digit number of reflections returned to the car’s radar,” says Dallmann. “That’s an extremely small number compared to the situation in a natural environment.” After all, real scenery contains hundreds of reflecting objects: people, cars, trees, traffic signals. Even a single vehicle in traffic can generate various reflections from different angles, for example a passenger car whose bumpers, wheels and side-view mirrors all reflect differently. “We’re still very far removed from a realistic setting when it comes to testing sensors for autonomous driving,” the engineer continues.

Radar target simulator generates as many as 300 reflections

That is why Dallmann and his team are developing a new, higher-performance radar target simulator called ATRIUM (the German acronym for “Automotive test environment for radar in-the-loop testing and measurements”), capable of generating significantly more reflecting objects. The current goal of Fraunhofer FHR is to be able to generate 300 reflections by the time the project ends – an ambitious objective. “This will mean that ATRIUM can present the car’s radar sensor with a relatively true-to-life scene, something like a drive-in movie for the radar sensor.” Since a patent application has been filed for the ATRIUM technology, Thomas Dallmann cannot yet reveal any details. But he can say this much: “We have optimized the structure of the transmission channels, making them much more cost-effective. As a result, the reflections can be represented in such a way that they reach the radar from a number of different directions.” This could make it possible to test new sensors for autonomous vehicles in full scope and under highly realistic conditions in the lab. “In the future, we’ll be able to run highly complex tests, which will make it possible to greatly reduce the time involved in road tests.”
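The details of ATRIUM are patent-pending, but the basic trick of any radar target simulator can be sketched: capture the transmitted waveform, then for each simulated reflection apply the round-trip delay (2R/c) and the Doppler shift (2vf_c/c) before returning the sum. The NumPy toy below implements that generic point-target model; the 77 GHz carrier, the sample rate and the three-reflection "car" are example values, not Fraunhofer's design.

```python
# Illustrative point-target echo model, not Fraunhofer's patented ATRIUM
# design: a radar target simulator delays and Doppler-shifts the captured
# transmit signal once per simulated reflection, then sums the echoes.
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def simulate_echo(tx, sample_rate_hz, carrier_hz, targets):
    """Superimpose one echo per (range_m, speed_m_s, gain) target.

    Each reflection is the transmit signal delayed by the round-trip time
    2R/c and shifted by the Doppler frequency 2 * v * f_c / c.
    """
    t = np.arange(len(tx)) / sample_rate_hz
    rx = np.zeros(len(tx), dtype=complex)
    for range_m, speed_m_s, gain in targets:
        delay = int(round(2 * range_m / C * sample_rate_hz))
        doppler_hz = 2 * speed_m_s * carrier_hz / C
        echo = np.roll(tx, delay)  # np.roll wraps around; fine for a toy model
        rx += gain * echo * np.exp(2j * np.pi * doppler_hz * t)
    return rx

# Example: three reflections from one receding car (bumper, wheel, mirror)
# seen by a 77 GHz automotive radar -- all parameters are example values.
tx = np.exp(2j * np.pi * 1e6 * np.arange(4096) / 100e6)  # baseband test tone
rx = simulate_echo(tx, sample_rate_hz=100e6, carrier_hz=77e9,
                   targets=[(30.0, -5.0, 1.0), (30.4, -5.0, 0.3), (30.9, -5.0, 0.1)])
print(rx.shape)  # (4096,)
```

Scaling this idea from a handful of such channels to some 300 reflections arriving from different directions is, in essence, the challenge the ATRIUM project is tackling.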
The grid is changing as the big, centralized providers of the past are replaced by smaller, distributed suppliers. Keeping such complex networks running stably requires high-resolution sensor technology – and AI provides a way to make accurate predictions and automatically detect disturbances or anomalies in real time. Here is how Fraunhofer researchers developed the compression techniques, algorithms and neural networks to make the power supply fit for the future.

The way power is generated is in transition: Whereas before, all our power came from big power plants, these days it also comes from a range of distributed sources, including wind turbines, photovoltaic systems and other similar facilities. This shift has a big impact on our grid – with particular challenges for the operators of transmission grids. How can they monitor grid parameters such as phase angle and frequency? Are there discrepancies or anomalies in the operation of the grid? Are lines or power plants down? Today’s standard measurement technology is no longer able to reliably answer these sorts of questions. More and more operators are therefore turning to additional phasor measurement units (PMUs) and other digital solutions. These systems measure the amplitude and phase of current and voltage up to 50 times a second. This process generates huge volumes of data – easily several gigabytes a day.

Data compression saves 80 percent of data

In response, researchers at the Advanced System Technology (AST) branch of the Fraunhofer Institute for Optronics, System Technologies and Image Exploitation IOSB in Ilmenau are looking for ways to optimize data processing using artificial intelligence, with a view to improving grid reliability and establishing a power supply system fit for the future. “We can use AI to automatically log, compress and process up to 4.3 million data sets per day,” says Prof. Peter Bretschneider, head of the Energy department at the AST branch of Fraunhofer IOSB. In the first phase of their work, the researchers came up with a compression technique that saves 80 percent of the data. Not only is the data easier to store, it is also faster and more efficient to process.

Automated data processing in real time

In the second phase, the researchers went on to use the phasor measurement data they had collected to apply neural networks – one of the key components of today’s artificial intelligence. More specifically, they “fed” the neural networks with examples of typical system outages. In this way, the algorithms gradually learned to distinguish normal operating data from defined system malfunctions – and to categorize the malfunctions precisely. Following the training phase, the researchers applied the neural networks to live data from phasor measurements – data that previously had to be recorded and processed manually. This is where the algorithm made its first leap into real-time application, making split-second decisions on whether there is an anomaly or fault, as well as the type and location of that disturbance. To take an example: if a power plant fails, an abrupt spike can be expected in the load placed on the other power plants. The increased load slows down the generators, and the frequency decreases. This calls for rapid countermeasures, because if the frequency falls below a threshold value, the operator may be forced to cut off sections of the grid for the sake of system stability.
And by rapid, we are talking about less than 500 milliseconds. Since the algorithm is capable of reaching a decision within 20–50 milliseconds, that leaves sufficient time to implement the appropriate fully automated countermeasures. The algorithm is ready to be implemented, and the researchers are continuing to work on the control and regulation of the relevant countermeasures. The development is of interest not only to the big operators of power transmission grids, but also to operators of regional distribution grids. “To make an analogy with the road network, what’s the point of having clear motorways when the smaller regional roads are permanently blocked?” says Bretschneider.

Power to predict problems of the future

All the same, the researchers are not restricting themselves to the problems of today; they also want to factor in anomalies that have not even occurred yet. “If we continue to pursue renewables, it may lead to situations we don’t even know about yet,” says Bretschneider. Here, too, the researchers are turning to artificial intelligence, working on categorizing these sorts of unknown phenomena and developing the appropriate algorithms using digital network maps.
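To make the timing argument concrete, here is a minimal Python sketch of an under-frequency watchdog running on a 50-frames-per-second PMU stream. The 49.8 Hz threshold and the three-frame confirmation window are invented example values, not Fraunhofer IOSB figures; the point is simply that a decision falls well inside the 500-millisecond budget.

```python
# Illustrative sketch of the timing argument above: an under-frequency
# watchdog on a 50-frames-per-second PMU stream. Threshold and window
# length are invented example values, not Fraunhofer IOSB figures.
from collections import deque

PMU_RATE_HZ = 50     # PMU frames per second, as described in the article
THRESHOLD_HZ = 49.8  # example under-frequency threshold
WINDOW = 3           # require 3 consecutive low frames = 60 ms

def detect_underfrequency(frequency_stream):
    """Yield the frame index at which an under-frequency event is flagged."""
    recent = deque(maxlen=WINDOW)
    for i, f in enumerate(frequency_stream):
        recent.append(f < THRESHOLD_HZ)
        if len(recent) == WINDOW and all(recent):
            yield i  # decision after WINDOW / PMU_RATE_HZ = 60 ms, well under 500 ms

# Example: a plant outage drags the frequency down from frame 5 onward.
stream = [50.00, 50.00, 49.99, 49.95, 49.90, 49.70, 49.60, 49.50]
print(list(detect_underfrequency(stream)))  # -> [7]
```

The neural networks described in the article go a step further than this simple threshold, classifying the type and location of the disturbance rather than merely flagging it.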
▣ CATEGORY OF COMPANIES (NACE Rev.2)
1. Computer programming, consultancy and related activities
2. Information service activities
3. Motion picture, video and television programme production, sound recording and music publishing activities
4. Programming and broadcasting activities
5. Publishing activities
6. Telecommunications
If you need bolts with a groove for a new design, are suffering from a supply bottleneck or want to purchase spare parts, then a visit to the stand hosted by mbo Osswald GmbH & Co. KG in Hannover would be a good idea. The company is at the fair to show how the mbo Osswald bolt configurator offers a fast and straightforward route to the bolt you need. Whether you're looking for tried-and-tested DIN bolts or bolts with a groove, this tool promises a speedy and user-friendly solution that will meet all your needs.

Using the mbo Osswald bolt configurator is a pretty simple task. First, combine the bolt form, material, shaft diameter and retainer version with the required, freely definable length or grip length. The configurator will then automatically calculate the required minimum dimensions at the touch of a button, create the corresponding dimensional drawing and display the price and current delivery time. After that, all you have to do is enter the required quantity and transfer everything to your shopping cart. What's more, adding the relevant retainer type in the required quantity to your cart and placing an order couldn't be easier - simply check the relevant box. Users can choose between bolts with or without a head in steel and stainless steel. The all-round slot applied to the shaft can be configured for various standard retainer types - locking washer DIN 6799, retaining ring DIN 471, SL-retainer, KL-retainer and bayonet clip.
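mbo Osswald's actual dimension rules and prices are naturally not public, but the workflow the configurator automates can be sketched in a few lines of Python. Every formula below (groove position, pricing, discount) is invented for illustration; only the selectable options and retainer types come from the text above.

```python
# Purely illustrative sketch -- mbo Osswald's actual dimension rules and
# prices are not public. Every formula below is invented; only the
# selectable options and retainer types come from the text.
from dataclasses import dataclass

RETAINERS = {"locking washer DIN 6799", "retaining ring DIN 471",
             "SL-retainer", "KL-retainer", "bayonet clip"}

@dataclass
class BoltConfig:
    form: str                 # "with head" or "without head"
    material: str             # "steel" or "stainless steel"
    shaft_diameter_mm: float
    length_mm: float          # freely definable length or grip length
    retainer: str

    def groove_position_mm(self):
        # Invented rule of thumb: groove sits one diameter from the end.
        if self.retainer not in RETAINERS:
            raise ValueError(f"unknown retainer type: {self.retainer}")
        return self.length_mm - self.shaft_diameter_mm

    def price_eur(self, quantity):
        # Invented pricing: base cost plus material volume, quantity discount.
        unit = 0.80 + 0.002 * self.shaft_diameter_mm * self.length_mm
        if self.material == "stainless steel":
            unit *= 1.5
        return round(unit * quantity * (0.9 if quantity >= 100 else 1.0), 2)

cfg = BoltConfig("with head", "stainless steel", 8.0, 40.0, "retaining ring DIN 471")
print(cfg.groove_position_mm(), cfg.price_eur(quantity=100))
```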
A healthy adult makes about 2 million blood cells every second, and 99 percent of them are oxygen-carrying red blood cells. The other one percent are platelets and the various white blood cells of the immune system. How all the different kinds of mature blood cells are derived from the same "hematopoietic" stem cells in the bone marrow has been the subject of intense research, but most studies have focused on the one percent, the immune cells.

"It's a bit odd, but because red blood cells are enucleated and therefore hard to track by genetic markers, their production has been more or less ignored by the vast number of studies in the past couple of decades," said Camilla Forsberg, professor of biomolecular engineering in the Baskin School of Engineering at UC Santa Cruz.

In a new study, published March 21 in Stem Cell Reports, Forsberg's lab overcame technical obstacles to provide a thorough accounting of blood cell production from hematopoietic stem cells. Their findings are important for understanding disorders such as anemia, diseases of the immune system, and blood cancers such as leukemias and lymphomas. "We're trying to understand the balance of production of blood cells and immune cells, which goes wrong in many kinds of disorders," Forsberg said.

The process by which hematopoietic stem cells give rise to mature blood cells involves multiple populations of progenitor cells that become progressively more committed to a specific "fate" as they develop into fully mature cells. A major fork in the road is between "lymphoid progenitors," which give rise to white blood cells called lymphocytes, and "myeloid progenitors," which give rise to other kinds of white blood cells, as well as red blood cells and platelets. The majority of cells in the bone marrow are in the myeloid lineage.

A key finding of the new study is that all progenitor cells with myeloid potential produce far more red blood cells than any other cell type. This was surprising because many previous studies, in which progenitor cells were grown in cell cultures ("in vitro"), found they had limited capacity to produce red blood cells and platelets. Forsberg said those results now appear to be an artifact of the culture conditions. "It's been hard to make sense of a lot of those experiments, because we know our bodies need to make a lot of red blood cells and platelets," she said. "Our results show that these progenitor cells retain a lot of red blood cell potential. In fact, we propose that red blood cell production is the default pathway."

In experiments led by first author Scott Boyer, a graduate student in Forsberg's lab, the researchers transplanted different progenitor cell populations into mice and tracked the production of red blood cells as well as platelets (the second largest component of blood) and immune cells. Boyer was also able to transplant single progenitor cells and then identify the blood and immune cells each one produced. By quantifying the numbers of mature blood cells produced from transplanted progenitors, the researchers were able to show that red blood cells were by far the most abundant cell type produced by every type of progenitor cell, with the exception of lymphoid progenitors. Their findings led to the development of a model of hematopoietic differentiation that focuses on red blood cells as the default pathway for all myeloid progenitors.
In addition to Forsberg and Boyer, the coauthors of the paper include Smrithi Rajendiran, Anna Beaudin, Stephanie Smith-Berdan, Praveen Muthuswamy, Jessica Perez-Cunningham, Eric Martin, Christa Cheung, Herman Tsang, and Mark Landon, all at the UC Santa Cruz Institute for the Biology of Stem Cells. This work was supported by the National Institutes of Health and the California Institute for Regenerative Medicine.