Nanotechnologies - General Concept for Pretty Large Amount of Pretty
Small Gadgets Embedded Into Something and Consequences for Design and Manufacturing
This topic is so fashionable these days that even the major public media outlets speak about it every week. Many scientific and engineering centers and professionals have already jumped on the bandwagon of nanotechnology.
Since we have been in this business for more than two decades, it is also interesting to us how people understand and perform on this stage.
The general point of view is that Nanotechnology is when something of very small size is embedded into a much larger volume of matter of another nature and becomes an agent of influence starting from that tiny scale, where an ordinary-size substance or tool is not applicable, or where its physical and chemical effects are of a different nature, mainly because of the size, acting up to the Upper scale. That is true up to the question of how many (and what kind of) these tiny agents (devices) need to be applied to gain some meaningful effect(s) on the Upper scale(s).
The answer to this question is that a huge number of these tiny gadgets must somehow be incorporated to reach a noticeable result.
Because most, if not all, of these devices, agents, mechanisms, or parts are more or less separate entities, distinct from the medium into which they are supposed to be embedded, the result, as one can guess, is a heterogeneous medium with a few (at least two) scales. The scales can be spatial, or time scales of a process.
But this is what the HSP-VAT is all about.
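The two-scale heterogeneous picture just described is exactly what the averaging operators of VAT formalize. As a minimal sketch in the conventional notation (generic symbols, not tied to any particular problem), the phase average over a representative averaging volume ΔΩ and the averaging theorem read:

```latex
% Phase average of a field f over the embedded-phase part \Delta\Omega_f
% of the representative averaging volume \Delta\Omega:
\langle f \rangle_f \;=\; \frac{1}{\Delta\Omega}\int_{\Delta\Omega_f} f \, d\omega
% Averaging theorem tying the two scales: the surface integral over the
% interphase boundary \partial S_w carries the lower-scale (nano-object
% surface) physics up into the Upper-scale equations.
\left\langle \nabla f \right\rangle_f \;=\;
  \nabla \langle f \rangle_f
  \;+\; \frac{1}{\Delta\Omega}\int_{\partial S_w} f \, d\vec{s}
```

It is the interface term that makes the Upper-scale equations remember the lower scale; dropping it reduces everything to homogeneous-medium modeling.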
We would like to start with observations on the overall situation in this area, in terms of what people understand and how they think it can be achieved. The comments below are necessary to explain the connections between the Nanotechnologies and the scaling HSP-VAT physics fields that we address in this and other sections of this website.
The DOE report at:
“Theory and Modeling in Nanoscience”
Report of the May 10–11, 2002, Workshop
Conducted by the Basic Energy Sciences
and Advanced Scientific Computing
Advisory Committees to the Office of Science,
Department of Energy
was completed with the intention of analyzing the situation and finding out or suggesting possible directions of work.
A few sentences from the report will help to illustrate the vision and the arguments that I present below and support throughout this website.
Most interesting for us is the chapter named -
A. Bridging Time and Length Scales
“Most phenomena of interest in nanoscience and nanotechnology, like many phenomena throughout science and engineering, demand that multiple scales be represented for meaningful modeling. However, phenomena of interest at the nanoscale may be extreme in this regard because of the lower end of the scale range. Many phenomena of importance have an interaction with their environment at macro space and time scales, but must be modeled in any first-principles sense at atomistic scales.”
“…Meanwhile, spatial scales may span from angstroms to centimeters (a range of 10^8). The cube of this would be characteristic of a three-dimensional continuum-based analysis. Counting atoms or molecules in a single physical ensemble easily leads to the same range; Avogadro’s number is about 10^24.”
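The arithmetic in this quote is easy to verify (a trivial check; the Avogadro value below is the CODATA constant, which the report rounds to ~10^24):

```python
# Check the scale-range arithmetic quoted from the DOE report.
spatial_range = 10**8            # angstroms (1e-10 m) up to centimeters (1e-2 m)
volume_range = spatial_range**3  # characteristic of a 3-D continuum analysis
avogadro = 6.02214076e23         # CODATA value; the report rounds it to ~10^24

print(volume_range)              # 10^24 written out in full
print(volume_range > avogadro)   # True: same order as a mole of atoms
```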
“ Therefore, nanoscience poses unprecedented challenges to the mathematics of multiscale representation and analysis and to the mathematics of synthesizing models that operate at different scales.... ”
“ Existing techniques may certainly yield some fruit, but we are undoubtedly in need of fundamental breakthroughs in the form of new techniques.”
“Mathematical homogenization (or “up-scaling”)—whereby reliable coarse-scale results are derived from systems with unresolvable scales, without the need to generate the (out of reach) fine scale results as intermediates—is a success story with extensive theory (dating at least to the 1930s) and applications.”
A reader should ask at this point: What? We have the ability to find something (resolve, solve) “...without the need to generate the (out of reach) fine scale results as intermediates...”? Without knowledge of that? Without knowledge of that scale’s model, data, etc.? Is this possible?
A new superficial scientific power? No, this is the same-scale solution; read about it in our “Governing Equations and "Averaging" in Homogenization Theory”.
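For context, the mathematical homogenization praised in the quote is, in its classical textbook form, the two-scale asymptotic expansion; this sketch uses the standard generic notation:

```latex
% Two-scale asymptotic expansion in the small parameter \varepsilon = l/L:
u^{\varepsilon}(x) \;=\; u_0\!\left(x, \tfrac{x}{\varepsilon}\right)
  \;+\; \varepsilon\, u_1\!\left(x, \tfrac{x}{\varepsilon}\right)
  \;+\; \varepsilon^{2} u_2\!\left(x, \tfrac{x}{\varepsilon}\right) \;+\; \cdots
% The u_i(x,y) are periodic in the fast variable y = x/\varepsilon; the
% leading term u_0 solves a coarse-scale equation with effective coefficients
% obtained from a periodic cell problem, so the fine-scale fields never
% appear explicitly in the result.
```

The coarse field u_0 comes from effective coefficients computed in a cell problem; this is one-way machinery producing a single upper-scale answer, not a two-way communication between scales.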
“ The wildest extrapolations of Moore’s Law for silicon cannot sate the appetite for cycles and memory in nanoscience modeling. It is clear that a brute force, first-principles approach to nanoscale phenomena will not succeed in bridging scales……”
Oh, this is really true.
Among the few good techniques mentioned in this chapter, one finds no real scaling theory that could be considered comparable with the VAT (homogenization theory is not a scaling theory: it does not connect the two neighboring scales in both directions, up and down, and it cannot correctly describe the scaling of heterogeneous media). There is no mention of the VAT techniques at all; either people are not familiar with the field, or they simply ignore one that has existed for more than 35 years! What is this?
From what is written in this review, it seems that the authors do not connect (or do not admit that the phenomena might not connect directly) the physical phenomena on neighboring scales, that is, the scales where one model has the finer and another (Upper) model the coarser physical resolution. But if these scales are not directly tied, then what is the Multiscaling here at all?
The authors answer this question directly themselves: “The Holy Grail of nanoscale modeling is a seamless bridging of scales with tight bi-directional coupling between every scale-adjacent pair of a sequence of models, from nano through micro and meso to macro.”
A good, characteristic reflection of my comments here is an example given in the same chapter, “A. Bridging Time and Length Scales,” of how different-scale phenomena should be connected in one model: “We mention here an example that appears to demand vast bridging of scales for unified scientific understanding, the operation of a biosensor to detect trace quantities of a biologically active molecule.” (A good VAT example as well.)
· “At the smallest scale, a molecule can be sensed by the luminescence characteristic of an optical energy gap. Quantum Monte Carlo could be used for a high accuracy comparison with experiment.
· At the next scale, where one needs to gain a quantitative understanding of surface structure and interface bonding, DFT and first-principles molecular dynamics could be employed.
· At the next higher scale, one would need to gain a quantitative understanding of the atomistic coupling of the molecule to be detected with the solvent that transports it to the sensor. This modeling could be modeled with classical molecular dynamics at finite temperature and classical Monte Carlo.
· Finally, at the macroscale, the solvent itself could be characterized with a continuous model, based on finite elements.”
As we can see, there are 4 scales involved by the authors of the report. Noticeably, at the same time, in the text cited above there is not a word on how the models would talk and transfer information among themselves, or how the scale models would transform one into another.
But this is the major issue! Well, if the models are truly scaled between themselves, then see this website, starting from http://travkin-hspt.com/fundament/index.htm .
Meanwhile, it is quite another thing if they are merely relevant to one another. Then, no questions, the connection can be done in a million frivolous ways.
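To make the complaint concrete, here is a hypothetical sketch of the minimum software contract that “tight bi-directional coupling” of one scale-adjacent pair would require. Everything here (class names, fields, the trivial transforms) is invented for illustration; the point is only that explicit upscale and downscale transforms must exist, not just shared databases.

```python
# Hypothetical sketch (not from any cited report or code): coupling one
# scale-adjacent pair of models requires explicit two-way transforms.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class ScaleModel:
    name: str
    state: Dict[str, float] = field(default_factory=dict)

@dataclass
class ScaleBridge:
    """Couples one scale-adjacent pair of models in both directions."""
    fine: ScaleModel
    coarse: ScaleModel
    upscale: Callable[[Dict[str, float]], Dict[str, float]]
    downscale: Callable[[Dict[str, float]], Dict[str, float]]

    def exchange(self) -> None:
        # Fine-scale results feed the coarse model, and the coarse fields
        # come back down as constraints on the fine model.
        self.coarse.state.update(self.upscale(self.fine.state))
        self.fine.state.update(self.downscale(self.coarse.state))

# Toy example: the "upscaling" is a stand-in for a real averaging operator,
# and the "downscaling" imposes the coarse field as a fine-scale boundary value.
nano = ScaleModel("nano", {"flux": 2.0})
meso = ScaleModel("meso", {})
bridge = ScaleBridge(nano, meso,
                     upscale=lambda s: {"mean_flux": s["flux"]},
                     downscale=lambda s: {"bc_flux": s["mean_flux"]})
bridge.exchange()
print(meso.state["mean_flux"])  # 2.0
```

Whether the upscale transform is a legitimate averaging operator or an ad hoc guess is precisely where the physics, rather than the informatics, decides the outcome.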
Another example of this kind of understanding can be observed on the website http://cmcs.ca.sandia.gov/ , where the practical goals announced are:
“· Architect and build an adaptive informatics infrastructure enabling multi-scale science
- XML data/metadata management services
- Chemical Science Portal enabling data-centric project- and community-level collaboration
- Middleware and tools for security, notification, collaboration
· Pilot project within combustion research community
- Enable rapid exchange of multi-scale data/pedigree
- Integrate chemical science tools that generate, use and archive metadata
· Demonstrate the power of adaptive infrastructure to existing and new areas as CMCS evolves
- Development environment for an evolving set of collaborative cross-scale science tools
- Develop collaborative data pedigree/annotation tools
· Gain adoption and continued support by science community participation
· Document success and continuation path.”
These tasks are to be performed with the help of computer science means for chemical projects over a HOMOGENEOUS subject (matter), even multiscale ones. This is not about multiscale, multi-media (heterogeneous) chemistry and physics: no two- or three-phase chemical reactors can legitimately be involved here, and the former should not be substituted for, or passed off as, the latter.
“An Informatics Approach to Multi-scale Science
To overcome current barriers to collaboration and knowledge transfer among researchers working at different scales, a number of enhancements must be made to the information technology infrastructure of the community:”
“We propose a Collaboratory for Multi-scale Chemical Science (CMCS) focusing on combustion research that will demonstrate that an integrated multi-scale approach to scientific and engineering research is not only possible but can produce significant benefits in harnessing research to address real-world issues. The field of combustion is critical to the DOE mission for clean and efficient energy, and the DOE has ongoing investments in research across the full range of relevant scales and disciplines. The CMCS will bring an integrated, informatics-based approach to combustion research that enhances and begins to automate the flow of information between sub-disciplines.”
So, this is informatics for chemical processes at different scales in combustion, i.e., database management for separate-scale phenomena, with no multiscale physics communication. At the same time it is proclaimed that: “Combustion systems involve three-dimensional, time-dependent, chemically reacting turbulent flows that may include multiphase effects with liquid droplets and solid particles in complex physical configurations.”
“The collaborative creation, discovery, and exchange of information across all of these scales and disciplines are required to meet DOE's mission requirements.”
This is simply a misunderstanding of what multiscaling is: it is not the management and exchange of databases. While this kind of “multi-scale chemical science (MCS) portal” should be an important instrument in the research development process, it still cannot serve as a substitute for the physics.
One more example of how researchers today construct separate-scale models and then declare that the next steps toward multiscaling are needed is the paper by
J. Guo, S. Datta, M. Lundstrom, and M. P. Anantram, “Towards Multi-Scale Modeling of Carbon Nanotube Transistors,”
- where “Multiscale simulation approaches are needed in order to address scientific and technological questions in the rapidly developing field of carbon nanotube electronics.”
In that study the authors wrap up the text “...with a brief discussion of how these semi-empirical device level simulations can be connected to ab initio, continuum, and circuit level simulations in the multi-scale hierarchy.”
In this paper a reader can learn that:
“In the ballistic limit, states within the device can be divided into two parts: 1) states filled by carriers from the source according to the source Fermi level, and 2) states filled by the drain according to the drain Fermi level.”
“….tight-binding approximation to describe the interaction between carbon atoms, and only nearest neighbor coupling is considered.”
“Semiclassical approaches are the method of choice when scattering dominates, and phenomenological quantum corrections can be made.” “Finally, work at the device level needs to be coupled to circuit level models so that the system level implications of novel devices can be readily explored. Existing approaches may or may not be adequate.”
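The ballistic two-reservoir picture quoted above is conventionally written as a Landauer-type integral, with source- and drain-filled states weighted by their own Fermi functions. The following is a generic numerical sketch, not the paper's actual implementation; the energy window, ideal step transmission, and parameter values are all assumptions for illustration:

```python
import numpy as np

Q = 1.602e-19   # elementary charge, C
H = 6.626e-34   # Planck constant, J*s
KT = 0.0259     # thermal energy at room temperature, eV

def fermi(E, mu):
    """Fermi-Dirac occupation at energy E for chemical potential mu (both in eV)."""
    return 1.0 / (1.0 + np.exp((E - mu) / KT))

def ballistic_current(mu_s, mu_d):
    """Landauer current per spin-degenerate mode:
    I = (2q/h) * integral of T(E) [f_S(E) - f_D(E)] dE."""
    E = np.linspace(-0.5, 1.0, 3001)   # assumed energy window, eV
    dE = E[1] - E[0]
    T = np.where(E > 0.0, 1.0, 0.0)    # ideal transmission above an assumed band edge
    integrand = T * (fermi(E, mu_s) - fermi(E, mu_d))
    return (2 * Q / H) * integrand.sum() * dE * Q  # trailing Q converts eV to J

# Source filled up to 0.1 eV, drain down at -0.2 eV: a few microamps per mode.
I = ballistic_current(mu_s=0.1, mu_d=-0.2)
print(f"{I:.2e} A")
```

Note that the current follows purely from the two contact Fermi levels, which is exactly the quoted division of device states into source-filled and drain-filled parts.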
All these lengthy citations are provided here for the sole purpose of making the current frantic situation with multiscale modeling in the Nanotechnologies more open and vivid. In reality, what we see is improvement of already known schemes for singled-out, separate-scale physical and mathematical models, not a connection (communication) of scales. We could cite many more studies, only to stress the obvious desperate claims toward “multiscaling,” which in reality are just semantic, database connections between the physics phenomena of various scales and the models of those phenomena. Obviously, those do not constitute multiscaling.
As we speak in the section “Atomic Scale Description of Matter with VAT” about the “pseudoscaling” approach, which right now is mostly being used just to get funding in such a fruitful field as the Nanotechnologies, the following two slides from Kintech Ltd. exhibit the qualitative, “verbal” understanding of this subject in most of the labs.
When someone wants to combine the outstanding features of nanotechnology, of which the most relevant few we discuss here are: 1) that those tiny objects, to be effective, must be present in enormous numbers; and 2) that we are interested in the bulk, lump-sum effect of those embedded nano-objects, which itself means that the Upper-Scale modeling is the obvious goal, then we understand that manufacturing is not the only challenge: the design at each scale cannot be separated from proper modeling at each scale and from the correct communication, exchange, and transformation of physical quantities and fields between the scales. This is the issue we cannot skip around, conceal, or silence.
While the topics of atomic-subatomic collective phenomena scaled modeling and simulation are addressed mostly in “Atomic Scale Description of Matter with VAT,” in this section we turn our attention mainly to mesoscale events, approximately in the range 10^(-8) - 10^(-4) [m].
As long as the claims of “Multiscaling” keep filling the mainstream materials science publications, we will keep analyzing them. Our few remarks on “pseudo-scaling” have already been posted in -
Meanwhile, in the following sub-sections of this section we intend to portray recent studies making contentions about multiscaling specifically in the Nanotechnology field. In most of these works there is not even an intention to provide the Averaging at all, or to introduce any grounds for “coupling” or “scaling,” apart from referring to the Grand computer power available for that kind of modeling.