

Thursday, July 30, 2015

No Small Task: Generating Robust Nano Data

Image used under Creative Commons license from brookhavenlab. Visualizing and measuring materials at the nanoscale: Center for Functional Nanomaterials at the Brookhaven National Laboratory.
A slogan that summarizes NGO and European Union Parliament requirements for regulating products of nanotechnology is “No data, no market.” But what kind of data, and for what kind of market? I participated in a National Nanotechnology Initiative (NNI)/Consumer Product Safety Commission (CPSC) workshop, “Quantifying Exposure to Engineered Nanomaterials from Manufactured Products” (QEEN), to get answers to those and related questions. The CPSC, whose budget was described by one of its officials as a “rounding error” relative to other NNI agencies’ budgets, co-organized an excellent workshop dedicated to producing data to protect consumers. According to both academic and regulatory scientist presentations at QEEN, it is no small task to generate reliable, good-quality data to measure the exposure of humans, animals and the environment to materials ranging from atomic to molecular size that have been advertised as the basis for the 21st Century Industrial Revolution.

At the opening of QEEN, the pressure on the scientists to deliver the data to enable regulatory permits to commercialize nano-products was expressed by the assistant director of the presidential Office of Science and Technology Policy (OSTP). Dr. Lloyd Whitman said that fifteen years after the launch of the NNI, it was time for NNI 2.0, the era of nanotechnology commercialization. However, Dr. Whitman acknowledged that a new “Environment, Health and Safety (EHS) ecosystem” with a faster throughput of data for evaluating risks would be needed. He announced an OSTP call for scientists to propose (and later achieve) solutions to “Nanotechnology-Inspired Grand Challenges.” But first it would be necessary for the scientists to resolve the pesky problem of generating data.

The scientists identified three interrelated problems in generating data that would reliably inform regulators which nanomaterials, in which products, at what point in their life cycle, would pose “unreasonable risks” (a term in U.S. law) to consumers, workers and the environment. I found the three problems crucial to understanding the exposures of consumers, farm workers, rural communities and the environment that could result from agri-nanotechnology products in the research and development pipeline, including nano-enabled pesticides, fertilizers and food packaging materials.

First, to get realistic data about possible risks of nanotechnology-enabled products as they are used, it will be necessary to have the cooperation of nanotechnology product developers. The scientists must evaluate nanomaterials in their product matrix, e.g. nano silicon dioxide in a dry soup mix or nano titanium dioxide in sunscreen ointments. This kind of evaluation is a different, and more difficult, task than safety assessment of the pristine nanomaterials that scientists synthesized and studied during the first decade of the NNI.

However, that necessary cooperation from the nanotech industry has not been forthcoming, since there is no rule to compel it to submit nanotechnology-enabled products, and data about those products, for pre-market safety assessment. The scientists have had to resort to informal networks to obtain unofficial product samples, whose risk assessments may have scientific validity but not regulatory validity, since the products were not obtained with the cooperation of the product developer seeking commercialization.

Second, many of the relevant experiments to simulate the “weathering,” or use over time, of nano-enabled products in order to obtain realistic exposure data require expensive equipment and repeated trials. Sometimes such equipment is not available over the longer timeframe needed for exposure studies. Such practical considerations are crucial in determining how, and how many, nanoparticles are released from their product matrix, which, in turn, determines human and environmental exposure. But for many nanomaterials, scientists do not yet understand what triggers particle release and under what conditions, so more experiments will be required to get realistic exposure data.

One academic scientist said that few commercial toxicology labs currently have the equipment and training to detect nanomaterials in products. How could their testing capacity be enhanced with less costly equipment than that used by government agencies and major research universities? One regulatory scientist indicated that there is currently no way to validate techniques for environmental modeling. This means there are data generated by experiments, but no database that will reliably inform a lab about the “fate and transport” of nanomaterials, i.e. where they will go and with what environmental, health and safety effects. There is an urgent need to publish the data sets, as well as the government-funded nanotechnology research papers, so that scientists can mine data sets beyond their own research to predict the environmental, health and safety effects of a nanotechnology application.

U.S. government funding has not been as forthcoming for exposure studies as it has been for determining nanomaterial hazards, such as unquantified potential for toxicity or mutagenicity. Another academic scientist said that he believed the use of carbon nanotubes (CNTs) to strengthen polymers, such as those used in automobile bumpers, could not become widespread without developing data to determine the effects of worker exposure to CNTs and developing adequate protective equipment and manufacturing procedures to minimize risks to workers.

A third problem is that the anticipated increasing complexity of nanomaterials, and their multiple insertion points into products and the human and natural environment, requires a research strategy that groups similar materials for experimentation and risk assessment. There is scientific consensus, if not publicly available industry data, about which Top Ten nanomaterials are most widely used. However, there are no public databases that group nanomaterials on the basis of their electrical, chemical, magnetic, thermal and other properties, such as shape, particle size distribution and other metrics. Without such databases, regulator and industry demands for high-throughput determination of environmental, health and safety effects cannot be realized. A smart industry would be happy to pay for such databases as part of a sustainable business model.

The OSTP recommends that the EHS regulators evaluate each application for commercialization on a case-by-case basis, even if the EHS agency does not have the budget, infrastructure or personnel to do scientifically robust risk assessment at scale. If the government and industry demanding the data are willing to pay for the experiments to generate it and for the computer programming to organize the data into relatable groups, a product-by-product pre-market safety assessment is technically feasible.
However, in my view, the default procedure of inadequately resourced regulators under the current anti-regulatory siege in Congress will likely be to deregulate. Deregulation, in which neither industry nor the government is legally liable for product safety, may be an attractive alternative to trying to regulate product applications for which the agency lacks the resources to conduct risk assessments based on robust exposure data from realistic use simulations of nanomaterials and nano-enabled products.
The science presented at QEEN was impressive in both its experimental design and ambition. While I cannot summarize in a blog post the variety of experiments reported, nor, indeed, report them accurately in detail, reports on some past NNI workshops were cited in the presentations, such as the recent report on the 2014 NNI workshop on nano-biosensors, in which IATP participated.

For example, it is now possible, as reported by Dr. Robert Mercer of the National Institute for Occupational Safety and Health, to visualize and count individual nanoparticles of different engineered nanomaterials in tissue section samples from different parts of the lungs and lymph nodes of postmortem laboratory rats. After 12 days of inhalation exposure to Multi-Walled Carbon Nanotubes (MWCNTs), the agglomeration of MWCNTs can be visualized over a 336-day period in the lungs, lymph nodes, diaphragm, kidneys and brains of the exposed rats. Such experiments are crucial for determining the environmental, health and safety effects of MWCNTs and for designing industrial processes and protective equipment for those working with MWCNTs in an industrial setting.

Notwithstanding such an impressive experimental achievement in pathology, the technical limitations of life cycle modeling of how nanoparticles are transported in air, water, blood, saliva and other fluids produce a wide range of uncertainty in the data, e.g. for predictive toxicology. Variants of one scientist’s comment were repeated by many: despite a decade of research in life cycle modeling of nanomaterials, scientists still cannot answer questions about human exposure assessment with the degree of precision and standardization required by regulators.

There are jokes about nanotechnology doing more with less, but if governments fail to adequately finance research into human, animal and environmental exposures to nanomaterials, and if industry continues to fail to provide scientists with nano-products to analyze in a regulatory process, regulators could allow a market for nanotechnology to develop without robust exposure data. The human and environmental health consequences of such a market, even if they do not result in acute toxicity, will not be funny at all.


Source: IATP