Join 8,500 technologists, regulators and users across commercial and defense sectors for AUVSI XPONENTIAL 2019, the largest, most comprehensive trade show for unmanned and autonomous systems. As adoption spreads and applications expand, XPONENTIAL is the one event that brings together the entire unmanned systems community to share ideas, collaborate across markets, capitalize on best practices and emerging trends and harness the power of unmanned technology for your business.


Charles River Analytics, Inc.

Cambridge, MA
United States
https://www.cra.com/
  • Booth: 419


Building human-centered, resilient, autonomous platforms

Since 1983, Charles River Analytics has been a pioneer in the core technologies that are enabling the revolutionary capabilities in today’s autonomous systems. Working with a wide range of robotics platform providers, we apply advancements in AI, machine learning, computer vision, cognitive systems and human-machine teaming to systems across the land, sea, air and space domains, serving the needs of customers in the DoD, DHS, NASA, and the robotics community at large.


 Press Releases

  • The Future of Battlefield Medicine:

    Charles River Analytics Develops Autonomous AI for Military Medicine

    CAMBRIDGE, MA – Charles River Analytics Inc., developer of intelligent systems solutions, will pioneer how autonomous artificial intelligence can help save lives under new contracts for the US Army’s Telemedicine & Advanced Technology Research Center (TATRC).

    “To become more resilient to the challenges on the battlefield, Warfighters need better access to casualty care, even when they’re cut off from support or evac,” said Max Metzger, Senior Software Engineer at Charles River Analytics.

    The Army awarded Charles River three related contracts to design components for an Automated Ruggedized Combat Casualty Care (ARC3) system. The ARC3 system aims to provide life-saving techniques and strategies for trauma care on the battlefield, known as Tactical Combat Casualty Care (TCCC). Each contract focuses on a different component of the ARC3 system—monitoring, diagnostics, and intervention.

    “We’re excited to be working on life-saving trauma systems,” added Metzger. “Under ARC3, we’re building software modules to help medics monitor patients, diagnose injuries, and provide treatment in areas that are isolated or difficult to access.”

    The Army also awarded a contract enhancement for our EPIC3 app, which helps medics diagnose and treat traumatic injuries. The enhancement will be used to develop tools that let TCCC experts author protocols that can be understood by medical AI.

    “EPIC3 offers a simple interface tailored to the medic’s needs and skill level,” added Metzger. “It compiles diagnostic and treatment techniques from medical experts and the latest research, and then presents medical alerts and treatment guidance.”
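
    As one illustration of what a machine-readable trauma-care protocol could look like, the sketch below pairs patient vitals with alerts and treatment guidance. The rule structure, field names, thresholds, and guidance text are hypothetical and are not drawn from EPIC3.

        # Hypothetical sketch of a machine-readable trauma-care protocol.
        # Field names, thresholds, and guidance text are illustrative only;
        # they are not EPIC3's actual data model.
        from dataclasses import dataclass
        from typing import Callable, List

        @dataclass
        class ProtocolRule:
            name: str
            condition: Callable[[dict], bool]   # evaluated against current vitals
            alert: str                          # message shown to the medic
            guidance: str                       # suggested next step

        RULES: List[ProtocolRule] = [
            ProtocolRule(
                name="possible hemorrhagic shock",
                condition=lambda v: v["heart_rate"] > 120 and v["systolic_bp"] < 90,
                alert="Tachycardia with hypotension detected.",
                guidance="Check for uncontrolled bleeding; consider tourniquet per TCCC.",
            ),
        ]

        def evaluate(vitals: dict) -> List[ProtocolRule]:
            """Return every rule whose condition matches the current vitals."""
            return [r for r in RULES if r.condition(vitals)]

        for rule in evaluate({"heart_rate": 135, "systolic_bp": 82}):
            print(rule.alert, "-", rule.guidance)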

    We recently presented EPIC3 at the U.S. Army Telemedicine & Advanced Technology Research Center (TATRC) open house.

    EPIC3 and ARC3 build on other techniques and apps Charles River has developed to support Warfighters and predict traumatic injury, such as the TMT tourniquet training system and the PROMPTER training tool for battlefield first-aid skills.

    See more of our Healthcare Support and Training efforts that address medical skills training, therapy and decision support tools, as well as sensor and sensing technologies. These efforts include our VITAMMINS medical simulation and tutoring system, and the STAT tablet-based training system, which presents a virtual patient in multiple trauma scenarios for efficient and effective learning, rehearsal, and assessment.

    Learn more about our capabilities at booth 419 at the Association for Unmanned Vehicle Systems International (AUVSI) XPONENTIAL 2019—the world’s largest tradeshow for unmanned and autonomous systems—from April 29-May 2 in Chicago, IL.

    For more information about these efforts or our other Healthcare Support and Training efforts, contact us.

    About Charles River Analytics: Since 1983, Charles River Analytics has been a pioneer in the core technologies that are enabling the revolutionary capabilities in today’s autonomous systems. Working with a wide range of robotics platform providers, we apply advancements in AI, machine learning, computer vision, cognitive systems and human-machine teaming to systems across the land, sea, air and space domains, serving the needs of customers in the DoD, DHS, NASA, and the robotics community at large.

    This work was supported by the US Army Medical Research and Materiel Command under Contract Nos. W81XWH-18-C-0008, W81XWH-18-C-0132, W81XWH-18-C-0330, and W81XWH-19-C-0029. The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an official Department of the Army position, policy or decision unless so designated by other documentation. The appearance of Department of Defense imagery does not imply endorsement.

  • Robots to the Rescue:

    Charles River Analytics Develops Autonomous Robot Collaboration Toolkit to Keep Combat Medics Safe

    CAMBRIDGE, MA – Charles River Analytics Inc., developer of intelligent systems solutions, has received funding from the US Army Medical Research and Materiel Command to integrate proven sensing and autonomy capabilities into a modular hardware/software Body-Aware Robotic Appliqué for Collaborative Evacuation (BRACE). BRACE can save lives in CASEVAC operations by keeping medics out of harm’s way and by evacuating casualties from active battlefields more quickly.

    “Unmanned systems can provide significant operational benefits for dangerous tasks, such as casualty extraction from active battlefield environments,” said Stan German, Senior Scientist at Charles River Analytics and Principal Investigator on the BRACE effort. “BRACE hardware and software enables plug-and-play integration with current and future unmanned systems, giving BRACE unprecedented adaptability.”

    BRACE uses state-of-the-art perception technologies to support effective manipulation, navigation, maneuvering, and obstacle avoidance in varied operating conditions. Our toolkit performs dynamic world modeling, 3D casualty perception, path planning, and localization onboard each vehicle. BRACE also shares relevant information with networked unmanned and manned teammates using a communications manager optimized for constrained-bandwidth networks.
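
    As a rough illustration of the constrained-bandwidth idea (not BRACE’s actual communications manager or message formats), a minimal priority-and-budget message scheduler might look like this:

        # Minimal sketch of a priority-based message scheduler for a
        # constrained-bandwidth network. Illustrative only; it does not reflect
        # BRACE's actual communications manager or message formats.
        import heapq
        import json

        class CommsManager:
            def __init__(self, bytes_per_cycle: int):
                self.bytes_per_cycle = bytes_per_cycle  # bandwidth budget per comms cycle
                self._queue = []                        # (priority, seq, payload bytes)
                self._seq = 0

            def enqueue(self, topic: str, data: dict, priority: int):
                payload = json.dumps({"topic": topic, "data": data}).encode()
                heapq.heappush(self._queue, (priority, self._seq, payload))
                self._seq += 1

            def next_batch(self) -> list:
                """Pop the highest-priority messages that fit in this cycle's budget."""
                batch, used = [], 0
                while self._queue and used + len(self._queue[0][2]) <= self.bytes_per_cycle:
                    _, _, payload = heapq.heappop(self._queue)
                    batch.append(payload)
                    used += len(payload)
                return batch

        mgr = CommsManager(bytes_per_cycle=256)
        mgr.enqueue("casualty_location", {"x": 12.4, "y": -3.1}, priority=0)  # 0 = most urgent
        mgr.enqueue("map_update", {"cells_changed": 48}, priority=5)
        for msg in mgr.next_batch():
            print(msg)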

    Under a related effort, we developed our Anthropometry and Pose Observation using Low-Dimensional Latent Optimization (APOLLO) toolkit, a key enabler for BRACE. With APOLLO, robots can visually perceive casualties in real time. Our toolkit identifies casualties in an image and models their position and body shape in 3D so that a team of small autonomous robots can remove them from harm.
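
    To show the general low-dimensional-latent-optimization idea in the abstract, the sketch below fits a toy linear body model to observed 2D keypoints. The model, projection, and optimizer are stand-ins and do not reflect APOLLO’s actual formulation.

        # Generic sketch of fitting a low-dimensional latent body model to observed
        # 2D keypoints by optimization. The linear "shape basis" is a stand-in;
        # it is not APOLLO's actual parameterization.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        N_KEYPOINTS, LATENT_DIM = 12, 4

        mean_body = rng.normal(size=(N_KEYPOINTS, 3))          # mean 3D keypoint layout
        basis = rng.normal(size=(LATENT_DIM, N_KEYPOINTS, 3))  # low-dimensional shape/pose basis

        def decode(z):
            """Map a latent vector to 3D keypoints (linear model for illustration)."""
            return mean_body + np.tensordot(z, basis, axes=1)

        def project(points_3d):
            """Orthographic projection onto the image plane (drop depth)."""
            return points_3d[:, :2]

        def fit(observed_2d):
            """Find the latent vector whose projected keypoints best match the image."""
            def loss(z):
                return np.sum((project(decode(z)) - observed_2d) ** 2) + 0.1 * np.sum(z ** 2)
            return minimize(loss, x0=np.zeros(LATENT_DIM)).x

        true_z = rng.normal(size=LATENT_DIM)
        observed = project(decode(true_z)) + rng.normal(scale=0.01, size=(N_KEYPOINTS, 2))
        print("recovered latent:", np.round(fit(observed), 2))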

    At Charles River Analytics, our deep understanding of applied robotics and autonomy is the result of extensive research, development, and deployment of solutions incorporating disciplines such as artificial intelligence, machine learning, cognitive science, and human factors. This experience uniquely positions us to provide mature and tailored solutions for our customers’ needs.

    Learn more about BRACE and APOLLO at the Association for Unmanned Vehicle Systems International (AUVSI) XPONENTIAL 2019—the world’s largest tradeshow for unmanned and autonomous systems—from April 29-May 2 in Chicago, IL. AUVSI represents corporations and professionals from more than 60 countries involved in industry, government, and academia.

    Contact us to learn more about BRACE, APOLLO, and our other Unmanned Systems capabilities.

    About Charles River Analytics: Since 1983, Charles River Analytics has been a pioneer in the core technologies that are enabling the revolutionary capabilities in today’s autonomous systems. Working with a wide range of robotics platform providers, we apply advancements in AI, machine learning, computer vision, cognitive systems and human-machine teaming to systems across the land, sea, air and space domains, serving the needs of customers in the DoD, DHS, NASA, and the robotics community at large.

    This effort was awarded in support of the US Army Medical Research and Materiel Command under Contract No. W81XWH-18-C-0079 and Contract No. W81XWH-19-C-0025. The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an official Department of the Army position, policy or decision.

    ###

  • The Future of Human-Machine Teams:

    Charles River Analytics Develops Appliqué to Naturally Direct Unmanned Robotic Vehicles

    CAMBRIDGE, MA – Charles River Analytics Inc., developer of intelligent systems solutions, has received funding through the US Army’s Combat Vehicle Robotics (CoVeR) program to enhance its Modular Appliqué Enabling Natural Teaming with Autonomy (MANTA) system. Consistent with the objective of the CoVeR program to develop technologies that support scalable integration of multi-domain robotic and autonomous systems, MANTA is a platform-independent, natural control and autonomy robot appliqué that enables a user to easily direct one or more host platforms to perform a range of autonomous behaviors.

    “With MANTA, our human-machine interface enables Commanders to issue instructions to one or more unmanned robotic systems using the same natural communication method that they use with human personnel,” said Camille Monnier, Principal Scientist at Charles River Analytics and Principal Investigator on the MANTA effort. “Our contributions in manned-unmanned teaming, or MUM-T, enable leap-ahead capabilities in mission-level autonomy.”

    Robotic systems equipped with the hardware/software appliqué can execute any behavior supported by platform capabilities, as well as share and access information with other vehicles equipped with the appliqué. Equipped platforms can be controlled using silent gesture-based commands, such as follow-me, stop, and relocate to a different position, or more complex speech-based commands, such as “monitor the back of the red building for activity,” “give me a close-up view of that white van,” and “emplace a remote sensing device in Zone 2.”
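
    As a minimal sketch of how such gesture- and speech-based commands might be routed to platform behaviors (the command names, behaviors, and dispatch logic below are hypothetical, not MANTA’s actual software):

        # Illustrative sketch of routing gesture- and speech-based commands to
        # platform behaviors; names and behaviors are hypothetical and do not
        # reflect MANTA's actual interface.
        from typing import Callable, Dict

        class Platform:
            def follow_me(self):            print("engaging follow-me behavior")
            def stop(self):                 print("stopping")
            def relocate(self, zone: str):  print(f"relocating to {zone}")
            def monitor(self, target: str): print(f"monitoring {target} for activity")

        GESTURES: Dict[str, Callable[[Platform], None]] = {
            "palm_forward": Platform.stop,
            "beckon":       Platform.follow_me,
        }

        def dispatch(platform: Platform, gesture: str = None, utterance: str = None):
            """Route a silent gesture or a parsed utterance to a platform behavior."""
            if gesture in GESTURES:
                GESTURES[gesture](platform)
            elif utterance and utterance.startswith("monitor "):
                platform.monitor(utterance[len("monitor "):])
            elif utterance and utterance.startswith("relocate to "):
                platform.relocate(utterance[len("relocate to "):])
            else:
                print("command not recognized; ignoring")

        ugv = Platform()
        dispatch(ugv, gesture="palm_forward")
        dispatch(ugv, utterance="monitor the back of the red building")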

    Rich Wronski, Vice President of Charles River’s Sensing, Perception and Applied Robotics Division, commented, “MANTA is one of our autonomy components built on mission-focused artificial intelligence that can integrate with current and evolving platforms. By integrating our reliable, adaptable robotic subsystems with their existing platforms, our customers can achieve sought-after mission-level autonomy for single, tele-operated platforms as well as multi-platform, collaborative robotic teams that self-organize around high-level objectives and commands.”

    At Charles River, our deep understanding of applied robotics and autonomy is the result of extensive research, development, and deployment of solutions incorporating disciplines such as artificial intelligence (AI), machine learning, cognitive science, and human factors. This experience uniquely positions us to provide mature and tailored solutions for our customers’ needs.

    Learn more about MANTA at the Association for Unmanned Vehicle Systems International (AUVSI) XPONENTIAL 2019—the world’s largest tradeshow for unmanned and autonomous systems—from April 29-May 2 in Chicago, IL. AUVSI represents corporations and professionals from more than 60 countries involved in industry, government, and academia. Visit Charles River Analytics at booth 419.

    Contact us to learn more about MANTA and our other Unmanned Systems capabilities.

    About Charles River Analytics: Since 1983, Charles River Analytics has been a pioneer in the core technologies that are enabling the revolutionary capabilities in today’s autonomous systems. Working with a wide range of robotics platform providers, we apply advancements in AI, machine learning, computer vision, cognitive systems and human-machine teaming to systems across the land, sea, air and space domains, serving the needs of customers in the DoD, DHS, NASA, and the robotics community at large.

    This effort was sponsored by the U.S. Government under Other Transaction number W15QKN-17-9-1025 with the National Advanced Mobility Consortium. The U.S. Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation herein. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of the U.S. Government.
    DISTRIBUTION STATEMENT A.   Approved for public release; distribution unlimited.

    ###

  • Welcome, AUVSI XPONENTIAL attendees!

    Visit Charles River Analytics at Booth #419!

    Charles River Analytics Inc., developer of intelligent systems solutions, will showcase our developments in applied robotics and artificial intelligence (AI) at the Association for Unmanned Vehicle Systems International (AUVSI) XPONENTIAL 2019—the world’s largest tradeshow for unmanned and autonomous systems—held in Chicago, IL, from April 29 to May 2. AUVSI represents corporations and professionals from more than 60 countries involved in industry, government, and academia.

    Is Your Platform Autonomous and Resilient?

    Building autonomous platforms that are resilient when faced with emerging complexity during a mission is a challenging endeavor. Designing autonomy that seamlessly integrates with your organization’s human processes and teammates is even more difficult.

    We offer a wide range of robotic subsystems and autonomy components built on mission-focused AI. These components are designed to integrate easily with current and evolving platforms.

    By integrating our reliable, adaptable robotic subsystems with existing solutions, our customers and their end users can achieve mission-level autonomy for single and multi-platform systems.

    Our deep understanding of applied robotics and autonomous systems is the result of extensive research, development, and deployment across many disciplines, such as AI, machine learning, cognitive science, and human factors. This understanding uniquely positions us to provide mature solutions tailored to our customers’ needs.

    Sensing and Perception

    Systems augmented with our sensing and perception components gain an integrated sense of vision, hearing, and touch, along with reasoning services that enable the system to perceive operational situations.

    Manned-Unmanned Teaming (MUM-T)

    Robots can receive and follow high-level commands instead of requiring step-by-step direction via teleoperation.  Our contributions enable leap-ahead capabilities, giving mission-level autonomy to individual robots and collaborative robotic teams.

    HMI/Supervisory Control

    We fuse speech, gestures, and traditional robot control technologies so unmanned vehicles can be managed using a smart device.  Our supervisory human-machine interfaces seamlessly integrate swarms into human teams.

    Swarms and Robot-to-Robot Collaboration

    Autonomous robots can perform risky tasks, such as casualty evacuation (CASEVAC), scouting, and explosive device countermeasures, to help keep your personnel safe. Our components help robotic swarms collaborate using biomimetic algorithms and deep machine learning to get the job done.
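
    For readers curious what a biomimetic coordination rule looks like in practice, the sketch below implements classic boids-style flocking; it is a generic textbook example, not Charles River’s swarm algorithms.

        # Classic biomimetic coordination rule (boids-style flocking) as a tiny
        # sketch; an illustrative example, not Charles River's swarm algorithm.
        import numpy as np

        def flock_step(positions, velocities, dt=0.1, neighbor_radius=5.0):
            """Update each robot's velocity from cohesion, separation, and alignment."""
            new_v = velocities.copy()
            for i, p in enumerate(positions):
                dists = np.linalg.norm(positions - p, axis=1)
                mask = (dists < neighbor_radius) & (dists > 0)
                if not mask.any():
                    continue
                cohesion   = positions[mask].mean(axis=0) - p               # move toward group
                separation = (p - positions[mask]).mean(axis=0)             # steer away from crowding
                alignment  = velocities[mask].mean(axis=0) - velocities[i]  # match heading
                new_v[i] += dt * (0.5 * cohesion + 0.3 * separation + 0.2 * alignment)
            return positions + dt * new_v, new_v

        pos = np.random.default_rng(1).uniform(0, 10, size=(6, 2))
        vel = np.zeros((6, 2))
        for _ in range(50):
            pos, vel = flock_step(pos, vel)
        print("final spread:", np.round(pos.std(axis=0), 2))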

    Autonomy Platforms/Infrastructure

    Our IOP-compliant appliqués bring new capabilities to existing robotic platforms, extending the useful life of these systems.  Because our solutions are developed on open architectures, we can customize both software and hardware to meet the challenges of niche applications.

    Learn More About How We Translate New AI Ideas into Actionable Solutions at Booth #419!

  • Charles River Analytics Inc., developer of intelligent systems solutions, is honored to be named one of Boston Business Journal’s 2019 Best Places to Work. We earned this distinction based on our benefits and top scores on employee satisfaction surveys.

    “The 2019 Best Places to Work recognition is a huge accomplishment for us,” said Jess DeCristoforo, Director of Human Resources at Charles River Analytics. “To be honored for the second time for this prestigious award shows the power of our employee ownership! The Boston Business Journal bases this distinction on our comprehensive benefits package and employee surveys, and we are extremely proud that our diverse staff excels in our workplace.”

    At Charles River Analytics, we proudly deliver state-of-the-art AI components and other systems solutions through our interdisciplinary research, which is deeply rooted in innovation and creativity. The drive to achieve breakthrough, cutting-edge technology creates a connected, vibrant workplace culture where employees can reach their full potential.

    Learn more about Charles River Analytics at www.cra.com.


 Products

  • On-Water Testing & Operations Services
    Our on-water testing and operations services include transport, assembly, deployment, operation, evaluation, repair, and maintenance of marine systems....

  • On-Water Testing & Operations Services

    The complexity of on-water testing often means prohibitively high costs and challenging schedule constraints.

    Charles River’s Marine Systems Test Facility is located on Block Island Sound in Wakefield, Rhode Island. It provides immediate year-round access to Navy-approved littoral testing waters in the Point Judith Harbor of Refuge, Block Island Sound, and Narragansett Bay. Our on-water testing and operations services include transport, assembly, deployment, operation, evaluation, repair, and maintenance of marine systems including:

    • Unmanned underwater vehicles (4 feet to 40 feet)

    • Underwater and ruggedized sensors

    • Underwater moored systems

    • Surface platforms and vessels

    • Custom and complex underwater devices and enclosures

    • Topside sensors, cameras, and navigation equipment


    Testing is led by the same staff who design and develop Navy marine systems, and is conducted with vessels captained by licensed mariners. The testing facility includes a design office, meeting space, and high-bay warehouse for fabrication, assembly, and secure storage, and is on the grounds of Point View Marina, which provides dockage for vessels up to 100 feet in length and hauling for vessels up to 70 tons, with 24 feet of beam.

    With our proximity to testing grounds, access to resources, and trained engineering and maritime staff, we aim to be the fastest and easiest solution for proving your marine systems in the water.

    Engineering & Prototype Development

    Charles River’s Hardware Engineering and Prototype Team includes system, software, electrical, mechanical, and ocean engineers who specialize in research and development of hardware prototypes for intelligent systems. We have diverse experience developing hardware systems for the DoD, including:

    • Unmanned vehicle communication, navigation, and control systems

    • Ruggedized and waterproof electronic systems

    • Depth-rated and pressure-tolerant housings

    • Underwater vehicle electrical and mechanical systems

    • UUV launch and recovery vessels and devices

    • Sensor and scientific testing rigs

    In addition to a full engineering staff, we have in-house technicians and tooling to fabricate, wire, build, and integrate prototypes for rapid development and testing cycles.

    We aim to be the most qualified and flexible team to develop hardware prototypes for your intelligent systems.

    For more information, contact:

    Mr. Jeff Prisco, Marine Systems Test Facility Manager
    jprisco@cra.com | 617.491.3474 x586

    Mr. Jay Everson, Marine Operations Lead
    jeverson@cra.com | 617.491.3474 x740

    Mr. Ross Eaton, Senior Scientist
    reaton@cra.com | 617.491.3474 x648

  • Autonomous Marine Meteorological Station
    SWIMS aims to dramatically improve the accuracy and frequency, and reduce the latency, of atmospheric data collection at the offshore air-sea interface...

  • Smart Weather Instrument (SWIMS)

    Autonomous marine meteorological station

    SWIMS aims to dramatically improve the accuracy and frequency, and reduce the latency, of atmospheric data collection at the offshore air-sea interface, while reducing the costs typically associated with disposable, human-deployed weather measurement systems.

    PROBLEM

    For the Navy to maintain operational superiority and adapt to the dynamic conditions of the warfighting environment, accurate weather forecasting is a necessity. The air-sea interface—where air, moisture, and heat mix in the lowest kilometers of the atmosphere—drives many weather phenomena and is of critical concern to forecasters, but is typically difficult and expensive to monitor, especially in remote parts of the ocean.

    SOLUTION

    SWIMS is an autonomous marine meteorological station that operates at sea to intelligently and persistently measure environmental parameters from the sea surface up to 1 kilometer into the atmosphere. The SWIMS platform mounts to a manned or unmanned surface vessel, where it remains at sea for months at a time, raising and lowering atmospheric sensors attached to a tethered blimp. Atmospheric data is streamed back to land-based weather stations to immediately update weather models, and is also fed back to the base station, where onboard autonomy processes it to inform future measurements.
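
    As an illustration of how such a measure-and-adapt loop might be structured (the sensor interface, thresholds, and sampling policy below are hypothetical and are not SWIMS’ onboard autonomy):

        # Illustrative sketch of an adaptive atmospheric-profiling loop: raise the
        # tethered sensor package, take readings, and sample more densely where the
        # profile changes fastest. Thresholds and the sensor interface are
        # hypothetical; this is not SWIMS' actual onboard autonomy.
        import random

        def read_sensors(altitude_m: float) -> dict:
            """Stand-in for the tethered sensor package (hypothetical interface)."""
            return {"alt_m": altitude_m,
                    "temp_c": 20.0 - 0.0065 * altitude_m + random.gauss(0, 0.1),
                    "rh_pct": 75.0 + random.gauss(0, 1.0)}

        def profile(max_alt_m=1000.0, coarse_step_m=100.0, fine_step_m=25.0, grad_thresh=0.01):
            readings, alt, step = [], 0.0, coarse_step_m
            while alt <= max_alt_m:
                readings.append(read_sensors(alt))
                if len(readings) >= 2:
                    d_temp = abs(readings[-1]["temp_c"] - readings[-2]["temp_c"])
                    # Sample more densely through sharp gradients (e.g., inversion layers).
                    step = fine_step_m if d_temp / step > grad_thresh else coarse_step_m
                alt += step
            return readings   # streamed back to shore-side forecast models

        for r in profile()[:3]:
            print(r)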

    For More Information

    Dr. Arjuna Balasuriya, Senior Scientist
    abalasuriya@cra.com
    617-491-3474 x778

    This material is based upon work supported by the US Navy, Office of Naval Research, under Contract No. N68335-18-C-0173. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the US Navy.

  • AutoTRap™
    The Onboard Automated Target Recognition App developed by Charles River Analytics...

  • Artificial Intelligence (AI) for Underwater Operations

    Given how quickly technology and the operational domain co-evolve, onboard AI software for underwater platforms can be a risky element of your platform supply chain. AI software must perform as expected, even when it must support new sensor types and models or changing mission environments and goals.

    Charles River Analytics offers an onboard automated target recognition (ATR) software package, AutoTRap™, which processes data from leading sonar sensors in real time and can be trained and updated for a variety of target profiles.

    Your platform can use AutoTRap’s target library to support real-time awareness for autonomous behavior, in‑mission operations, or command and control. No additional engineering is required to expand AutoTRap’s target library because it updates itself from the training data you provide.

    Charles River Analytics develops best-in-class software—we combine agile innovation with a track record of hardened engineering in austere environments. Our ATR software is developed, integrated, and deployed on market-leading AUV platforms.

    With AutoTRap implemented onboard your unmanned underwater vehicles (UUVs), operators can search for and automatically detect targets of interest in real time. AutoTRap has two main components: the Target Detector and the Target Library. Novel machine learning algorithms train the detector to locate targets based on the profiles stored in the library.
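
    The sketch below illustrates this two-component split with a hypothetical API: a Target Library that holds labeled training snippets and a Target Detector that matches new sonar chips against them. The class names and the simple correlation matcher are illustrative only and are not AutoTRap’s SDK.

        # Hypothetical sketch of the two-component idea described above: a Target
        # Library of example profiles and a Target Detector trained from it. The
        # API names and nearest-template matcher are illustrative, not AutoTRap's SDK.
        import numpy as np

        class TargetLibrary:
            """Stores labeled sonar-snippet templates provided as training data."""
            def __init__(self):
                self.templates = []   # list of (label, normalized snippet)

            def add(self, label: str, snippet: np.ndarray):
                s = snippet.astype(float)
                self.templates.append((label, (s - s.mean()) / (s.std() + 1e-9)))

        class TargetDetector:
            """Matches sonar chips against the library; updating means re-reading it."""
            def __init__(self, library: TargetLibrary, threshold: float = 0.7):
                self.library, self.threshold = library, threshold

            def detect(self, chip: np.ndarray):
                c = chip.astype(float)
                c = (c - c.mean()) / (c.std() + 1e-9)
                best_label, best_score = None, -1.0
                for label, tmpl in self.library.templates:
                    score = float(np.mean(c * tmpl))   # normalized correlation
                    if score > best_score:
                        best_label, best_score = label, score
                return (best_label, best_score) if best_score >= self.threshold else (None, best_score)

        lib = TargetLibrary()
        lib.add("mine-like object", np.random.default_rng(0).normal(size=(16, 16)))
        label, score = TargetDetector(lib).detect(np.random.default_rng(0).normal(size=(16, 16)))
        print(label, round(score, 2))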

    Onboard ATR capabilities on UUVs offer significant advantages in underwater operations:
    • Reduced launch and recovery times – target detections are sent to command and control stations in real time while the UUV is submerged.
    • Lower costs and optimized operation times, enabling coverage of a wider search area.
    • More accurate target classification – operators can re-direct UUVs to further examine targets based on real-time detections.

    Minimum System Requirements:
    • Operating System: Windows® 7/10, Linux®
    • Processor: Intel® Core™ i5
    • Memory: 4 GB of RAM
    • Storage: 10 GB of Free Space
    • Teledyne CARIS™ Onboard: Version 2.0
    • Teledyne CARIS™ Configuration: Version 5.0

    Data Requirements
    • Side scan data of the target of interest from at least 8 directions

    Contact
    Richard Wronski
    rwronski@cra.com
    617.491.3474 x568
     

  • Follow-Me
    A human-machine interface for unmanned ground vehicles...

  • Follow-Me

    With Follow-Me, unmanned ground vehicles (UGVs) become true support agents on human-robot teams

    “The first easy-to-use and easy-to-understand autonomous ground systems that can earn the trust of human operators will mark a turning point in the adoption and deployment of UGVs as true support agents within human-robot teams.” - Camille Monnier, Principal Scientist

    The Follow-Me human-machine interface lets a UGV accompany a human operator, like a fellow squad member, through challenging outdoor environments. It responds to speech and gesture commands, providing verbal and non-verbal feedback, maintaining formation, and avoiding obstacles along the way.

    The interface works with onboard autonomy software so the UGV can act both as a scout and robotic mule, carrying cumbersome equipment and eliminating the need for remote control piloting. The system also provides speech-based natural language processing, enabling hands-free control and feedback.

    Features

    • Hands-Free Following – A mobile robot can follow a leader on its own, without the need for a remote control.
    • Gesture-Based Controls – The robot can be controlled using natural hand gestures, such as raising or lowering a hand, or pointing in the direction the robot should go.
    • Voice Commands – The robot can be controlled using simple verbal commands, such as follow me, back off, come closer, or drive two meters to your 6 o’clock.
    • Obstacle Avoidance – The robot automatically avoids obstacles and hazards during navigation.
    • Easy Integration – Follow-Me integrates with existing commercial, off-the-shelf computing and sensing hardware or as a standalone hardware product. It also integrates with ROS-enabled platforms.
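
    For teams working with ROS-enabled platforms, a minimal ROS 1 node along these lines might subscribe to voice commands and a leader position and publish velocity commands. The topic names and the simple proportional controller below are hypothetical and are not Follow-Me’s actual interface.

        #!/usr/bin/env python
        # Illustrative ROS 1 node showing how a Follow-Me-style capability might plug
        # into a ROS-enabled platform. Topic names and the proportional controller
        # are hypothetical; this is not the Follow-Me product's interface.
        import rospy
        from geometry_msgs.msg import Twist, PointStamped
        from std_msgs.msg import String

        class FollowMeNode:
            FOLLOW_DISTANCE_M = 2.0

            def __init__(self):
                self.enabled = False
                self.cmd_pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
                rospy.Subscriber("voice_command", String, self.on_voice)          # hypothetical topic
                rospy.Subscriber("leader_position", PointStamped, self.on_leader)  # hypothetical topic

            def on_voice(self, msg):
                if msg.data == "follow me":
                    self.enabled = True
                elif msg.data in ("stop", "back off"):
                    self.enabled = False
                    self.cmd_pub.publish(Twist())   # zero velocity

            def on_leader(self, msg):
                if not self.enabled:
                    return
                cmd = Twist()
                # Simple proportional controller toward the leader, holding a standoff distance.
                cmd.linear.x = 0.5 * max(0.0, msg.point.x - self.FOLLOW_DISTANCE_M)
                cmd.angular.z = 1.0 * msg.point.y
                self.cmd_pub.publish(cmd)

        if __name__ == "__main__":
            rospy.init_node("follow_me_sketch")
            FollowMeNode()
            rospy.spin()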

    For More Information

    For questions on Follow-Me or our related capabilities, visit Charles River Analytics at Booth 419!

  • MANTA
    Modular Appliqué Enabling Natural Teaming with Autonomy (MANTA) system...

  • MANTA is a platform-independent, natural control and autonomy robot appliqué that enables a user to easily direct one or more host platforms to perform a range of autonomous behaviors.

    “With MANTA, our human-machine interface enables Commanders to issue instructions to one or more unmanned robotic systems using the same natural communication method that they use with human personnel,” said Camille Monnier, Principal Scientist at Charles River Analytics. “Our contributions in manned-unmanned teaming, or MUM-T, enable leap-ahead capabilities in mission-level autonomy.”

    Robotic systems equipped with the hardware/software appliqué can execute any behavior supported by platform capabilities, as well as share and access information with other vehicles equipped with the appliqué. Equipped platforms can be controlled using silent gesture-based commands, such as follow-me, stop, and relocate to a different position, or more complex speech-based commands, such as “monitor the back of the red building for activity,” “give me a close-up view of that white van,” and “emplace a remote sensing device in Zone 2.”

    Rich Wronski, Vice President of Charles River’s Sensing, Perception and Applied Robotics Division, commented, “MANTA is one of our autonomy components built on mission-focused artificial intelligence that can integrate with current and evolving platforms. By integrating our reliable, adaptable robotic subsystems with their existing platforms, our customers can achieve sought-after mission-level autonomy for single, tele-operated platforms as well as multi-platform, collaborative robotic teams that self-organize around high-level objectives and commands.”

    At Charles River, our deep understanding of applied robotics and autonomy is the result of extensive research, development, and deployment of solutions incorporating disciplines such as artificial intelligence (AI), machine learning, cognitive science, and human factors. This experience uniquely positions us to provide mature and tailored solutions for our customers’ needs.

    Talk to us about MANTA at Booth 419!

