
LEADING EDGE TECHNOLOGIES (formerly Critical Technologies)

Thank you all for your continued feedback and input. As we move toward our announced industry day on November 3rd, your comments help us greatly as we clarify and codify the message. In that spirit, we have reconsidered our "Critical Technology" approach and elements.

The first change is the naming of these technologies: we now refer to them as Leading Edge Technologies, and some elements have evolved in the transition. While those listed now are in keeping with the original post, some have branched out into separate elements, and others have been relabeled for clarity, in keeping with the current reference source this community identified - www.techtarget.com (in my opinion a great find - thank you, Industry!).

Please take a look at this new list, the abbreviated definitions, and the links to the expanded information. You may provide your comments, keeping in mind that the current thinking is to use these technologies as evidence of relevant IT experience. This list does not define or limit what will emerge into the full Scope of Alliant 2 or Alliant 2 Small Business.

As always, your comments are encouraged, but if you cannot post comments or prefer to remain anonymous, you can respond directly to me. Then - if you allow - I can share your comments without attribution.


Agile Development In software application development, agile software development (ASD) is a methodology for the creative process that anticipates the need for flexibility and applies a level of pragmatism into the delivery of the finished product. http://searchsoftwarequality.techtarget.com/definition/agile-software-development
Artificial Intelligence AI (pronounced AYE-EYE) or artificial intelligence is the simulation of human intelligence processes by machines, especially computer systems. http://searchcio.techtarget.com/definition/AI
Big Data Big data is an evolving term that describes any voluminous amount of structured, semi-structured and unstructured data that has the potential to be mined for information. Although big data doesn't refer to any specific quantity, the term is often used when speaking about petabytes and exabytes of data. http://searchcloudcomputing.techtarget.com/definition/big-data-Big-Data
Biometrics Biometrics is the science and technology of measuring and statistically analyzing biological data. In information technology, biometrics usually refers to technologies for measuring and analyzing human body characteristics such as fingerprints, eye retinas and irises, voice patterns, facial patterns, and hand measurements, especially for authentication purposes. http://searchsecurity.techtarget.com/definition/biometrics
Brain Computer Interface (BCI) Brain-computer interface (BCI) is a collaboration between a brain and a device that enables signals from the brain to direct some external activity, such as control of a cursor or a prosthetic limb. The interface enables a direct communications pathway between the brain and the object to be controlled. In the case of cursor control, for example, the signal is transmitted directly from the brain to the mechanism directing the cursor, rather than taking the normal route through the body's neuromuscular system from the brain to the finger on a mouse. http://whatis.techtarget.com/definition/brain-computer-interface-BCI
Cloud Computing  A cloud service has three distinct characteristics that differentiate it from traditional hosting. It is sold on demand, typically by the minute or the hour; it is elastic -- a user can have as much or as little of a service as they want at any given time; and the service is fully managed by the provider.  Cloud computing is also a general term for anything that involves delivering hosted services over the Internet. These services are broadly divided into three categories: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS).  http://searchcloudcomputing.techtarget.com/definition/cloud-computing
Cryptography Cryptography is a method of storing and transmitting data in a particular form so that only those for whom it is intended can read and process it. The term is most often associated with scrambling plaintext (ordinary text, sometimes referred to as cleartext) into ciphertext (a process called encryption), then back again (known as decryption). http://searchsoftwarequality.techtarget.com/definition/cryptography
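The encrypt/decrypt round trip described above can be sketched in a few lines of Python. This is a toy XOR stream cipher invented purely for illustration (the hash-chained `keystream` is not a real cipher construction); production systems should use a vetted algorithm such as AES from a maintained cryptography library.

```python
# A minimal sketch of symmetric encryption: plaintext -> ciphertext -> plaintext.
# Toy construction for illustration only; do NOT use for real security.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic pseudo-random keystream by chained hashing."""
    out = b""
    block = key
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:length]

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR data with the keystream; the same call both encrypts and decrypts."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

plaintext = b"ordinary text"
ciphertext = xor_cipher(plaintext, b"secret")  # encryption: scramble cleartext
recovered = xor_cipher(ciphertext, b"secret")  # decryption: same key recovers it
assert recovered == plaintext
```

Because XOR is its own inverse, one function handles both directions; only holders of the key can reproduce the keystream and read the data.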
Cyber Security Cybersecurity is the body of technologies, processes and practices designed to protect networks, computers, programs and data from attack, damage or unauthorized access. In a computing context, the term security implies cybersecurity. Ensuring cybersecurity requires coordinated efforts throughout an information system. Elements of cybersecurity include:

Application security
Information security
Network security
Disaster recovery / business continuity planning
End-user education.

Data Center As A Service (DCaaS) A data center as a service (DCaaS) provider will supply turnkey physical data center facilities and computing infrastructure (e.g., servers, networking, storage, and so on) to clients in the form of a service. http://searchdatacenter.techtarget.com/definition/data-center-as-a-service-DCaaS 
Data Federation  Data federation technology, also called data virtualization technology or data federation services, is software that provides an organization with the ability to collect data from disparate sources and aggregate it in a virtual database where it can be used for business intelligence (BI) or other analysis. http://searchdatamanagement.techtarget.com/definition/data-federation-technology 
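The "virtual database over disparate sources" idea above can be sketched as follows. The two source systems here (a CRM record store and an ERP revenue table) and their field names are invented for illustration; a real federation layer would query live systems on demand rather than in-memory structures.

```python
# A minimal sketch of data federation: two disparate sources exposed
# through one virtual query layer, without copying data into a warehouse.
crm_source = {"A001": {"customer": "Acme", "region": "East"}}          # hypothetical CRM
erp_source = [{"customer_id": "A001", "revenue": 125000}]              # hypothetical ERP

def federated_view(customer_id: str) -> dict:
    """Join records from both sources at query time, as a virtual database would."""
    record = dict(crm_source.get(customer_id, {}))
    for row in erp_source:
        if row["customer_id"] == customer_id:
            record["revenue"] = row["revenue"]
    return record

print(federated_view("A001"))
# -> {'customer': 'Acme', 'region': 'East', 'revenue': 125000}
```

The caller sees one aggregated record suitable for BI or other analysis, while the underlying data stays in its original systems.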
Enterprise Architecture (EA) An enterprise architecture (EA) is a conceptual blueprint that defines the structure and operation of an organization. The intent of an enterprise architecture is to determine how an organization can most effectively achieve its current and future objectives. http://searchcio.techtarget.com/definition/enterprise-architecture 
Health Information Technology Health IT (information technology) is the area of IT involving the design, development, creation, use and maintenance of information systems for the healthcare industry. Automated and interoperable healthcare information systems are expected to lower costs, improve efficiency and reduce error, while also providing better consumer care and service. http://searchhealthit.techtarget.com/definition/Health-IT-information-technology 
Internet of Things The Internet of Things (IoT) is a scenario in which objects, animals or people are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. IoT has evolved from the convergence of wireless technologies, micro-electromechanical systems (MEMS) and the Internet. http://whatis.techtarget.com/definition/Internet-of-Things  
Mobile Application Management Mobile application management is the delivery and administration of enterprise software to end users’ corporate and personal smartphones and tablets. Unlike mobile device management software, which assists with device activation, enrollment and provisioning, MAM software assists with software delivery, software licensing, configuration, application life cycle management (ALM) and usage tracking. Many mobile application managers can also compare mobile device type and ownership to IT-defined policies and limit how corporate data can be shared among mobile apps. http://searchconsumerization.techtarget.com/definition/mobile-application-management 
Mobile Security Mobile security is the protection of smartphones, tablets, laptops and other portable computing devices, and the networks they connect to, from threats and vulnerabilities associated with wireless computing. Mobile security is also known as wireless security.  Securing mobile devices has become increasingly important in recent years as the numbers of the devices in operation and the uses to which they are put have expanded dramatically. The problem is compounded within the enterprise as the ongoing trend toward IT consumerization is resulting in more and more employee-owned devices connecting to the corporate network. http://whatis.techtarget.com/definition/mobile-security 
Predictive Analytics Predictive analytics is the branch of data mining concerned with the prediction of future probabilities and trends. The central element of predictive analytics is the predictor, a variable that can be measured for an individual or other entity to predict future behavior. For example, an insurance company is likely to take into account potential driving safety predictors such as age, gender, and driving record when issuing car insurance policies. http://searchcrm.techtarget.com/definition/predictive-analytics 
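Following the car-insurance example above, a predictor-based score can be sketched like this. The predictors and weights here are invented for illustration and are not drawn from any real actuarial model.

```python
# A minimal sketch of predictive scoring: measurable predictors (age,
# driving record) combined into a single estimate of future risk.
def risk_score(age: int, violations: int) -> int:
    """Combine predictor variables into a 0-100 risk score (weights are illustrative)."""
    score = 0
    if age < 25:
        score += 30            # younger drivers weighted as higher risk
    score += 20 * violations   # each past violation raises the estimate
    return min(score, 100)     # cap at 100

assert risk_score(22, 2) == 70   # young driver, two violations
assert risk_score(40, 0) == 0    # experienced, clean record
```

In practice the weights would be fitted from historical data (e.g., via regression), which is what distinguishes predictive analytics from hand-tuned rules.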
Robotics Robotics is a branch of engineering that involves the conception, design, manufacture, and operation of robots. This field overlaps with electronics, computer science, artificial intelligence, mechatronics, nanotechnology, and bioengineering. http://whatis.techtarget.com/definition/robotics 
Speech Recognition Voice or speech recognition is the ability of a machine or program to receive and interpret dictation, or to understand and carry out spoken commands. http://searchcrm.techtarget.com/definition/voice-recognition 
Sustainability Risk Management Sustainability risk management (SRM) is a business strategy that aligns profit goals with a company's environmental policies.  The goal of SRM is to make this alignment efficient enough to sustain and grow a business while preserving the environment. One of the chief drivers for SRM adoption is increasing demand for compliance with global and national regulations.  http://searchcio.techtarget.com/definition/sustainability-risk-management-SRM 
Virtualization In IT, virtualisation is the process of creating logical computing resources from available physical resources. This is accomplished using virtualisation software to create a layer of abstraction between workloads and the underlying physical hardware. Once installed, the virtualised computing resources such as memory, CPUs, network and disk I/O and storage can all be pooled and provisioned to workloads without regard for physical location within a data center.  http://searchvirtualdatacentre.techtarget.co.uk/definition/Virtualisation 
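The pooling-and-provisioning idea in the definition above can be sketched as a simple capacity pool. The `ResourcePool` class is a made-up illustration; real hypervisors add scheduling, isolation, and overcommit policies far beyond this.

```python
# A minimal sketch of virtualization's resource pooling: physical capacity
# is abstracted into a pool that workloads draw from on demand.
class ResourcePool:
    """Pool of abstracted compute capacity (illustrative, not a real hypervisor)."""
    def __init__(self, cpus: int, memory_gb: int):
        self.cpus = cpus
        self.memory_gb = memory_gb

    def provision(self, cpus: int, memory_gb: int) -> bool:
        """Allocate a workload's share if capacity remains; False otherwise."""
        if cpus <= self.cpus and memory_gb <= self.memory_gb:
            self.cpus -= cpus
            self.memory_gb -= memory_gb
            return True
        return False

pool = ResourcePool(cpus=16, memory_gb=64)
assert pool.provision(4, 16)        # first workload fits
assert pool.provision(4, 16)        # second workload fits
assert not pool.provision(16, 64)   # exceeds remaining capacity
```

Workloads request capacity without regard for which physical host supplies it, which is the abstraction layer the definition describes.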





The Health Information Technology category appears focused on personal health information and on Electronic Health Records. There are many other areas of HIT that could be supported through Alliant II, including:

population health needs of healthcare delivery, such as identification of drug and device adverse event reports,
contributing to disease registries,
monitoring for disease outbreaks,
regulatory aspects of the healthcare industry, such as IT systems to support clinical trials, pharmaceutical manufacturing and compliance, and
processing and analysis of submissions to regulatory agencies.

Information Innovators Inc recommends that the language in the solicitation be broad enough to enable Agencies with these types of requirements to use Alliant II.
SRA appreciates the opportunity to respond to your request for comments on the Alliant II relevant technologies. It is through collaborative processes like yours, where the entire community can participate and the government can assimilate the best ideas, that procurements and the resulting proposals become more responsive to the customer's needs. We elected to use your previous four "Emerging Technology" questions as a logical framework to comment on your updated listing of "Leading Edge Technologies."

How critical do you think any of these trends will be over the next 10-15 years?

All of these technology trends will meet critical needs, but the demand won't be equal and will depend on an organization's mission and functional domains.

The trends with the broadest applicability and longevity across the entire 10-15 year timeframe are in the areas of: Systems Development (Agile Development, Cloud Computing, Data Center as a Service (to include Software Defined Networks), Enterprise Architecture, Health IT, Internet of Things, Mobile Application Management, Speech Recognition, Sustainability Risk Management, and Virtualization); Data (Artificial Intelligence, Big Data, Data Federation, and Predictive Analytics); and Security (Biometrics, Cryptography, Cyber Security, and Mobile Security).

Brain Computer Interface and Robotics have narrower applicability across information technology consumers, but that is not to say they aren't critical for meeting certain needs, e.g., assistance to individuals with disabilities. These two emerging technologies will not begin to reach operational maturity until 5-10 years from now.

Which are your top three?

The top three, measured in terms of operational utility, would be big data, cloud, and cyber security.

That said, business and mission solutions developed through the integration of big data, cloud, mobile, cybersecurity, and social business media will have overwhelming importance over the next 5-10 years. Authoritative information technology sources (McKinsey, Gartner, Forrester, IDG, IDC, etc.) forecast that the integration of these five technologies, often referred to as the Nexus of Forces, will account for the vast majority of IT growth through 2020. We would add that Agile and DevOps development processes will be key to doing this development and integration effectively and efficiently... the first time.

Are we missing emerging trends that you are tracking?

In keeping with your current thinking to use a select list of technologies as evidence of relevant IT experience for making Alliant II awards, the move from emerging to leading technologies is prudent: many emerging technologies, while promising, are immature, 5-15 years away from providing proven, community-wide operational benefit, and exist only in R&D accounts. Many of today's emerging technologies will evaporate or be overcome by events.

We have identified several leading but proven technologies based on our own experience and validated by the information technology sources mentioned previously:

Computational Linguistics and Natural Language Processing to mine, analyze, and correlate the wealth of information in unstructured data, which comprises at least 80% of the data that exists today. As social media explodes, this will become even more critical.
Social Business and Collaboration Tools and Processes to harness the capabilities and intellectual capacity of the entire enterprise, accelerating innovative, comprehensive, collaborative solutioning while increasing the mind-share of individuals through real-time content creation and sharing.
Organizational Change Management to ensure agencies can fully leverage Web 2.0 and later working environments while implementing new technologies and integrating millennials into the workforce.
DevOps to transform systems development into an Agile-like process that ensures continuity, efficiency, and economy throughout the development lifecycle: on time, within budget, and done right the first time.
Additions to Cybersecurity: automatic, hardware-based, zero-day malware prevention, detection, isolation, and analysis, rather than signature-based software.
Health IT: cloud-based access to genome datasets to support worldwide analysis and accelerated disease prevention and recovery treatments.
Health IT: master data management of medical records across disparate organizations to ensure security, privacy, and data integrity.

Note: Quantum Computing is a special case; it is an emerging, not a leading, technology. If and when it can be harnessed economically, it would become the most disruptive force in information technology since the introduction of the microchip. Significant investments are being made in this technology based on its potential impact, although most of the research is conducted outside the public view. Depending on the pace of investment, the promises of quantum computing may begin to be fulfilled in the 2020s.

Which of the above should we ignore for now?

As alluded to earlier, we feel the brain computer interface and robotics categories are still very much in the R&D lifecycle and are not appropriate technologies for evaluating the suitability of Alliant II awardees.