UTIAS / en ¯r¶¹Íø team takes top spot in self-driving car challenge for 6th time in 7 years /news/u-t-team-takes-top-spot-self-driving-car-challenge-6th-time-7-years <span class="field field--name-title field--type-string field--label-hidden">¯r¶¹Íø team takes top spot in self-driving car challenge for 6th time in 7 years</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2024-08/AUTODRIVE_24_5601-crop.jpg?h=3a919dd0&amp;itok=kFCXUnGZ 370w, /sites/default/files/styles/news_banner_740/public/2024-08/AUTODRIVE_24_5601-crop.jpg?h=3a919dd0&amp;itok=YPAb6B8H 740w, /sites/default/files/styles/news_banner_1110/public/2024-08/AUTODRIVE_24_5601-crop.jpg?h=3a919dd0&amp;itok=Q35dvO8b 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2024-08/AUTODRIVE_24_5601-crop.jpg?h=3a919dd0&amp;itok=kFCXUnGZ" alt="UofT's self driving car avoids a mock moose crossing the road"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-08-07T13:42:55-04:00" title="Wednesday, August 7, 2024 - 13:42" class="datetime">Wed, 08/07/2024 - 13:42</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>As part of the competition, the ¯r¶¹Íø team’s autonomous vehicle had to react to obstacles such as a fake deer moving across the road (photo courtesy of aUToronto)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/alumni" hreflang="en">Alumni</a></div> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/self-driving-cars" hreflang="en">Self-Driving Cars</a></div> <div class="field__item"><a href="/news/tags/undergraduate-students" hreflang="en">Undergraduate Students</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">"Each time we saw an obstacle – a stop sign, a red light, the railroad bar coming down – and the car reacted by stopping and then continuing, we let out a big cheer or a sigh of relief"</div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>A team from the ¯r¶¹Íø has placed first for sixth time in seven years in a North American self-driving car competition.&nbsp;</p> <p>After finishing in second place last year, <a href="https://www.autodrive.utoronto.ca">the aUToronto team</a> returned to the top spot at&nbsp;the <a href="https://www.autodrivechallenge.com" target="_blank">2024 SAE AutoDrive Challenge II</a>, which was held in June at the Mcity Test Facility in Ann Arbor, Mich.</p> <p>The aUToronto team competed against nine other teams from across Canada and the United States.</p> <p>“Through the AutoDrive Challenge, we are preparing the next generation of engineers to head into the industry, to keep pushing towards the challenging goal of reaching Level 4 autonomous driving,†says&nbsp;<strong>Tim Barfoot</strong>, a professor at the ¯r¶¹Íø Institute for Aerospace Studies (UTIAS) in the Faculty of Applied Science &amp; Engineering and one of the team’s academic advisers.&nbsp;&nbsp;&nbsp;</p> <p>“The team did another excellent job this year.â€&nbsp;&nbsp;</p> <p>The team approached the competition by going back to first principles to ensure they had a reliable and robust system, says&nbsp;<strong>Kelvin Cui</strong>, a ¯r¶¹Íø Engineering alumnus and&nbsp;the team’s principal.&nbsp;&nbsp;</p> <p>He joined aUToronto last fall after five years with the ¯r¶¹Íø Formula Racing team, where he founded the “driverless†division.&nbsp;&nbsp;&nbsp;</p> <p>“We looked at what was going to get us the most points at competition and made sure that we were not overbuilding our system and adding too much complexity,†he says.&nbsp;&nbsp;</p> <p>This meant pushing for additional testing time at UTIAS and achieving more than 900 kilometres of system testing prior to the competition.&nbsp;&nbsp;&nbsp;</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-08/AUTODRIVE_24_5334-crop.jpg?itok=xSJviMQl" width="750" height="500" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>The team placed first out of 10 teams from institutions across the United States and Canada (photo courtesy of aUToronto)</em></figcaption> </figure> <p>A partnership with the AutoDrive team from Queen’s University was instrumental to aUToronto’s preparation. The aUToronto team drove Artemis, their autonomous vehicle, to Kingston, Ont. to assess the system at Queen’s testing facility, which features intersections and electronic streetlights.&nbsp;&nbsp;</p> <p>“We added radar to our vehicle as a new sensor, so we needed to be aware of all the sensor failure modes,†says third-year Engineering Science student <strong>Robert Ren</strong>.&nbsp;&nbsp;</p> <p>“A lot of our testing time went into making sure that including radar didn’t break anything else in our system, and that it could handle any sensor failure cases.â€&nbsp;&nbsp;</p> <p>Including radar sensors in the vehicle’s perception system&nbsp;allowed it to measure the motion of objects directly, which is not possible with light detection and ranging (LiDAR) sensors. &nbsp;&nbsp;</p> <p>“Radar can help with adverse weather object detections,†adds Ren.&nbsp;“So, if the vehicle is operating under heavy rain or fog, the LiDAR is going to be limited, but the radio waves from radar can help the vehicle see what objects are in front and what objects are moving. This enables it to make good decisions when driving in uncertain scenarios.â€â€¯&nbsp;&nbsp;</p> <blockquote class="instagram-media" data-instgrm-captioned data-instgrm-permalink="https://www.instagram.com/p/C9ycZUeNM64/?utm_source=ig_embed&amp;utm_campaign=loading" data-instgrm-version="14" style=" background:#FFF; border:0; border-radius:3px; box-shadow:0 0 1px 0 rgba(0,0,0,0.5),0 1px 10px 0 rgba(0,0,0,0.15); margin: 1px; max-width:540px; min-width:326px; padding:0; width:99.375%; width:-webkit-calc(100% - 2px); width:calc(100% - 2px);"> <div style="padding:16px;"> <div style=" display: flex; flex-direction: row; align-items: center;"> <div style="background-color: #F4F4F4; border-radius: 50%; flex-grow: 0; height: 40px; margin-right: 14px; width: 40px;">&nbsp;</div> <div style="display: flex; flex-direction: column; flex-grow: 1; justify-content: center;"> <div style=" background-color: #F4F4F4; border-radius: 4px; flex-grow: 0; height: 14px; margin-bottom: 6px; width: 100px;">&nbsp;</div> <div style=" background-color: #F4F4F4; border-radius: 4px; flex-grow: 0; height: 14px; width: 60px;">&nbsp;</div> </div> </div> <div style="padding: 19% 0;">&nbsp;</div> <div style="display:block; height:50px; margin:0 auto 12px; width:50px;"><a href="https://www.instagram.com/p/C9ycZUeNM64/?utm_source=ig_embed&amp;utm_campaign=loading" style=" background:#FFFFFF; line-height:0; padding:0 0; text-align:center; text-decoration:none; width:100%;" target="_blank"><svg height="50px" version="1.1" viewBox="0 0 60 60" width="50px" xmlns:xlink="https://www.w3.org/1999/xlink"><g fill="none" fill-rule="evenodd" stroke="none" stroke-width="1"><g fill="#000000" transform="translate(-511.000000, -20.000000)"><g><path d="M556.869,30.41 C554.814,30.41 553.148,32.076 553.148,34.131 C553.148,36.186 554.814,37.852 556.869,37.852 C558.924,37.852 560.59,36.186 560.59,34.131 C560.59,32.076 558.924,30.41 556.869,30.41 M541,60.657 C535.114,60.657 530.342,55.887 530.342,50 C530.342,44.114 535.114,39.342 541,39.342 C546.887,39.342 551.658,44.114 551.658,50 C551.658,55.887 546.887,60.657 541,60.657 M541,33.886 C532.1,33.886 524.886,41.1 524.886,50 C524.886,58.899 532.1,66.113 541,66.113 C549.9,66.113 557.115,58.899 557.115,50 C557.115,41.1 549.9,33.886 541,33.886 M565.378,62.101 C565.244,65.022 564.756,66.606 564.346,67.663 C563.803,69.06 563.154,70.057 562.106,71.106 C561.058,72.155 560.06,72.803 558.662,73.347 C557.607,73.757 556.021,74.244 553.102,74.378 C549.944,74.521 548.997,74.552 541,74.552 C533.003,74.552 532.056,74.521 528.898,74.378 C525.979,74.244 524.393,73.757 523.338,73.347 C521.94,72.803 520.942,72.155 519.894,71.106 C518.846,70.057 518.197,69.06 517.654,67.663 C517.244,66.606 516.755,65.022 516.623,62.101 C516.479,58.943 516.448,57.996 516.448,50 C516.448,42.003 516.479,41.056 516.623,37.899 C516.755,34.978 517.244,33.391 517.654,32.338 C518.197,30.938 518.846,29.942 519.894,28.894 C520.942,27.846 521.94,27.196 523.338,26.654 C524.393,26.244 525.979,25.756 528.898,25.623 C532.057,25.479 533.004,25.448 541,25.448 C548.997,25.448 549.943,25.479 553.102,25.623 C556.021,25.756 557.607,26.244 558.662,26.654 C560.06,27.196 561.058,27.846 562.106,28.894 C563.154,29.942 563.803,30.938 564.346,32.338 C564.756,33.391 565.244,34.978 565.378,37.899 C565.522,41.056 565.552,42.003 565.552,50 C565.552,57.996 565.522,58.943 565.378,62.101 M570.82,37.631 C570.674,34.438 570.167,32.258 569.425,30.349 C568.659,28.377 567.633,26.702 565.965,25.035 C564.297,23.368 562.623,22.342 560.652,21.575 C558.743,20.834 556.562,20.326 553.369,20.18 C550.169,20.033 549.148,20 541,20 C532.853,20 531.831,20.033 528.631,20.18 C525.438,20.326 523.257,20.834 521.349,21.575 C519.376,22.342 517.703,23.368 516.035,25.035 C514.368,26.702 513.342,28.377 512.574,30.349 C511.834,32.258 511.326,34.438 511.181,37.631 C511.035,40.831 511,41.851 511,50 C511,58.147 511.035,59.17 511.181,62.369 C511.326,65.562 511.834,67.743 512.574,69.651 C513.342,71.625 514.368,73.296 516.035,74.965 C517.703,76.634 519.376,77.658 521.349,78.425 C523.257,79.167 525.438,79.673 528.631,79.82 C531.831,79.965 532.853,80.001 541,80.001 C549.148,80.001 550.169,79.965 553.369,79.82 C556.562,79.673 558.743,79.167 560.652,78.425 C562.623,77.658 564.297,76.634 565.965,74.965 C567.633,73.296 568.659,71.625 569.425,69.651 C570.167,67.743 570.674,65.562 570.82,62.369 C570.966,59.17 571,58.147 571,50 C571,41.851 570.966,40.831 570.82,37.631" /></g></g></g></svg></a></div> <div style="padding-top: 8px;"> <div style=" color:#3897f0; font-family:Arial,sans-serif; font-size:14px; font-style:normal; font-weight:550; line-height:18px;"><a href="https://www.instagram.com/p/C9ycZUeNM64/?utm_source=ig_embed&amp;utm_campaign=loading" style=" background:#FFFFFF; line-height:0; padding:0 0; text-align:center; text-decoration:none; width:100%;" target="_blank">View this post on Instagram</a></div> </div> <div style="padding: 12.5% 0;">&nbsp;</div> <div style="display: flex; flex-direction: row; margin-bottom: 14px; align-items: center;"> <div> <div style="background-color: #F4F4F4; border-radius: 50%; height: 12.5px; width: 12.5px; transform: translateX(0px) translateY(7px);">&nbsp;</div> <div style="background-color: #F4F4F4; height: 12.5px; transform: rotate(-45deg) translateX(3px) translateY(1px); width: 12.5px; flex-grow: 0; margin-right: 14px; margin-left: 2px;">&nbsp;</div> <div style="background-color: #F4F4F4; border-radius: 50%; height: 12.5px; width: 12.5px; transform: translateX(9px) translateY(-18px);">&nbsp;</div> </div> <div style="margin-left: 8px;"> <div style=" background-color: #F4F4F4; border-radius: 50%; flex-grow: 0; height: 20px; width: 20px;">&nbsp;</div> <div style=" width: 0; height: 0; border-top: 2px solid transparent; border-left: 6px solid #f4f4f4; border-bottom: 2px solid transparent; transform: translateX(16px) translateY(-4px) rotate(30deg)">&nbsp;</div> </div> <div style="margin-left: auto;"> <div style=" width: 0px; border-top: 8px solid #F4F4F4; border-right: 8px solid transparent; transform: translateY(16px);">&nbsp;</div> <div style=" background-color: #F4F4F4; flex-grow: 0; height: 12px; width: 16px; transform: translateY(-4px);">&nbsp;</div> <div style=" width: 0; height: 0; border-top: 8px solid #F4F4F4; border-left: 8px solid transparent; transform: translateY(-4px) translateX(8px);">&nbsp;</div> </div> </div> <div style="display: flex; flex-direction: column; flex-grow: 1; justify-content: center; margin-bottom: 24px;"> <div style=" background-color: #F4F4F4; border-radius: 4px; flex-grow: 0; height: 14px; margin-bottom: 6px; width: 224px;">&nbsp;</div> <div style=" background-color: #F4F4F4; border-radius: 4px; flex-grow: 0; height: 14px; width: 144px;">&nbsp;</div> </div> <p style=" color:#c9c8cd; font-family:Arial,sans-serif; font-size:14px; line-height:17px; margin-bottom:0; margin-top:8px; overflow:hidden; padding:8px 0 7px; text-align:center; text-overflow:ellipsis; white-space:nowrap;"><a href="https://www.instagram.com/p/C9ycZUeNM64/?utm_source=ig_embed&amp;utm_campaign=loading" style=" color:#c9c8cd; font-family:Arial,sans-serif; font-size:14px; font-style:normal; font-weight:normal; line-height:17px; text-decoration:none;" target="_blank">A post shared by aUToronto (@autoronto_uoft)</a></p> </div> </blockquote> <script async src="//www.instagram.com/embed.js"></script> <p>In an event where both LiDAR and radar sensors fail, the aUToronto system can still rely on visual cameras to perform object tracking. This made the team’s object tracker much more robust compared to last year when the team&nbsp;experienced sensor failure during a dynamic event.&nbsp;</p> <p><strong>Brian Cheong</strong>, a ¯r¶¹Íø Engineering master’s student who has been a member of aUToronto since 2021, acted as technical director of the autonomy team this year –&nbsp;part of a new leadership structure introduced by Cui.&nbsp; &nbsp;</p> <p>“In the past, it was a lot of work for our team’s principal to keep track of all the systems,†Cheong says.&nbsp;“So instead of having to work directly with all 15 sub teams, Kelvin created groups of sub teams that we called stacks, and each stack had a director.â€&nbsp;&nbsp;</p> <p>The restructuring and technical innovations paid off, with aUToronto completing its first clean sweep in the AutoDrive Challenge II, placing first in all static and&nbsp;dynamic events, including the concept design presentation and intersection challenge.&nbsp;&nbsp;</p> <p>“The intersection challenge was a big highlight for us,†says Cheong. “Kelvin and Robert were in the car, and I was on the sidelines watching with the rest of the team.&nbsp;Each time we saw an obstacle – a stop sign, a red light, the railroad bar coming down – and the car reacted by stopping and then continuing, we let out a big cheer or a sigh of relief.&nbsp;&nbsp;&nbsp;</p> <p>“And then we were all silent as the car approached the final obstacle, which was a deer. We watched as Artemis slowed down to a stop and the deer moved by. Then we screamed and cheered, and we could hear cheering from inside the car.â€&nbsp;&nbsp;</p> <p>“Our success is entirely a team effort,†adds Cui. “It was not smooth sailing before the competition. The only reason we won is because everybody put in so much effort to test our vehicle every day.</p> <p>“That’s how we were able to get this reliable system across the line.â€â€¯&nbsp;&nbsp;</p> <div> <div class="field field--name-field-media-oembed-video field--type-string field--label-hidden field__item"><iframe src="/media/oembed?url=https%3A//youtu.be/gG7DG-t2aiQ%3Fsi%3DkYGqZF0-x-6a4MBn&amp;max_width=0&amp;max_height=0&amp;hash=6whKFK-X5NSAGZdfMqSydpcgBMCmEPw2x-2wTgtl2jw" width="200" height="113" class="media-oembed-content" loading="eager" title="AutoDrive Challenge II Year 3 Highlight Video"></iframe> </div> </div> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 07 Aug 2024 17:42:55 +0000 Christopher.Sorensen 308926 at Start@UTIAS /node/308585 <span class="field field--name-title field--type-string field--label-hidden">Start@UTIAS</span> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>laurie.bulchak</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-07-25T21:33:52-04:00" title="Thursday, July 25, 2024 - 21:33" class="datetime">Thu, 07/25/2024 - 21:33</time> </span> <div class="field field--name-field-url field--type-string field--label-above"> <div class="field__label">URL</div> <div class="field__item">https://www.utias.utoronto.ca/startutias-entrepreneurship-program/</div> </div> <div class="field field--name-field-tags field--type-entity-reference field--label-above clearfix"> <h3 class="field__label">Tags</h3> <ul class="links field__items"> <li><a href="/news/tags/institute-aerospace-studies" hreflang="en">Institute for Aerospace Studies</a></li> <li><a href="/news/tags/utias" hreflang="en">UTIAS</a></li> </ul> </div> <div class="field field--name-field-campus field--type-entity-reference field--label-above"> <div class="field__label">Campus</div> <div class="field__item"><a href="/taxonomy/term/7034" hreflang="en">Off Campus</a></div> </div> Fri, 26 Jul 2024 01:33:52 +0000 laurie.bulchak 308585 at ¯r¶¹Íø researchers enhance object-tracking abilities of self-driving cars /news/u-t-researchers-enhance-object-tracking-abilities-self-driving-cars <span class="field field--name-title field--type-string field--label-hidden">¯r¶¹Íø researchers enhance object-tracking abilities of self-driving cars</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=mDJZAkzx 370w, /sites/default/files/styles/news_banner_740/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=VS33Oojz 740w, /sites/default/files/styles/news_banner_1110/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=lwAIt_Pp 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2024-05/PXL_20230608_181335793-crop.jpg?h=7575563c&amp;itok=mDJZAkzx" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>rahul.kalvapalle</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-05-29T10:59:42-04:00" title="Wednesday, May 29, 2024 - 10:59" class="datetime">Wed, 05/29/2024 - 10:59</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>Sandro Papais, a PhD student, is the co-author of a new paper that introduces a graph-based optimization method to improve object tracking for self-driving cars&nbsp;(photo courtesy of aUToronto)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/breaking-research" hreflang="en">Breaking Research</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/self-driving-cars" hreflang="en">Self-Driving Cars</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">The new tools could help robotic systems of autonomous vehicles better track the position and motion of vehicles, pedestrians and cyclists<br> </div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Researchers at the ¯r¶¹Íø Institute for Aerospace Studies (UTIAS) have introduced a pair of high-tech tools that could improve the safety and reliability of autonomous vehicles by enhancing the reasoning ability of their robotic systems.</p> <p>The innovations address multi-object tracking, a process used by robotic systems to track the position and motion of objects – including vehicles, pedestrians and cyclists – to plan the path of self-driving cars in densely populated areas.</p> <p>Tracking information is collected from computer vision sensors (2D camera images and 3D LIDAR scans) and filtered at each time stamp, 10 times a second, to predict the future movement of moving objects.&nbsp;&nbsp;</p> <p>“Once processed, it allows the robot to develop some reasoning about its environment. For example, there is a human&nbsp;crossing the street at the intersection, or a cyclist changing lanes up ahead,†says&nbsp;<strong>Sandro Papais</strong>, a PhD student in UTIAS in the Faculty of Applied Science &amp; Engineering. "At each time stamp, the robot’s software tries to link the current detections with objects it saw in the past, but it can only go back so far in time.â€&nbsp;</p> <p><a href="https://arxiv.org/pdf/2402.17892">In a new paper</a> presented at the 2024 International Conference on Robotics and Automation in Yokohama, Japan, Papais and co-authors <strong>Robert Ren</strong>, a third-year engineering science student, and Professor <strong>Steven Waslander</strong>, director of UTIAS’s <a href="https://www.trailab.utias.utoronto.ca/">Toronto Robotics and AI Laboratory</a>, introduce Sliding Window Tracker (SWTrack) – a graph-based optimization method that uses additional temporal information to prevent missed objects.</p> <p>The tool is designed to improve the performance of tracking methods, particularly when objects are occluded from the robot’s point of view.&nbsp;</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-05/Objects%20and%20Labels.jpg?itok=mTZFj1NL" width="750" height="426" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>A visualization of a nuScenes dataset used by the researchers. The image is a mosaic of the six different camera views around the car with the object bounding boxes rendered overtop of the images (image courtesy of the Toronto Robotics and AI Laboratory)</em></figcaption> </figure> <p>&nbsp;</p> <p>“SWTrack widens how far into the past a robot considers when planning,†says Papais. “So instead of being limited by what it just saw one frame ago and what is happening now, it can look over the past five seconds and then try to reason through all the different things it has seen.â€â€¯&nbsp;&nbsp;</p> <p>The team tested, trained and validated their algorithm on field data obtained through nuScenes, a public, large-scale dataset for autonomous driving vehicles that have operated on roads in cities around the world. The data includes human annotations that the team used to benchmark the performance of SWTrack.&nbsp;&nbsp;</p> <p>They found that each time they extended the temporal window, to a maximum of five seconds, the tracking performance got better. But past five seconds, the algorithm’s performance was slowed by computation time.&nbsp;&nbsp;&nbsp;</p> <p>“Most tracking algorithms would have a tough time reasoning over some of these temporal gaps. But in our case, we were able to validate that we can track over these longer periods of time and maintain more consistent tracking for dynamic objects around us,†says Papais.&nbsp;</p> <p>Papais says he’s looking forward to building on the idea of improving robot memory and extending it to other areas of robotics infrastructure.&nbsp;“This is just the beginning,†he says. “We’re working on the tracking problem, but also other robot problems, where we can incorporate more temporal information to enhance perception and robotic reasoning.â€&nbsp;&nbsp;</p> <p>Another paper, <a href="https://arxiv.org/pdf/2402.12303">co-authored by master’s student <strong>Chang Won (John) Lee</strong> and Waslander</a>, introduces UncertaintyTrack, a collection of extensions for 2D tracking-by-detection methods that leverages probabilistic object detection.&nbsp;&nbsp;&nbsp;</p> <p>“Probabilistic object detection quantifies the uncertainty estimates of object detection,†explains Lee. “The key thing here is that for safety-critical tasks, you want to be able to know when&nbsp;the predicted detections are likely to cause errors in downstream tasks such as multi-object tracking. These errors can occur because of low-lighting conditions or heavy object occlusion.&nbsp;&nbsp;</p> <p>“Uncertainty estimates give us an idea of when the model is in doubt, that is, when it is highly likely to give errors in predictions. But there’s this gap because probabilistic object detectors aren’t currently used in multi-tracking object tracking.â€â€¯&nbsp;&nbsp;</p> <p>Lee worked on the paper as part of his undergraduate thesis in engineering science. Now a master’s student in Waslander’s lab, he is researching visual anomaly detection for the Canadarm3, Canada’s contribution to the U.S.-led Gateway lunar outpost.&nbsp;&nbsp;“In my current research, we are aiming to come up with a deep-learning-based method that detects objects floating in space that pose a potential risk to the robotic arm,†Lee says.</p> <p>Waslander says the advancements outlined in the two papers build on work that his lab has been focusing on for a number of years.</p> <p>“[The Toronto Robotics and AI Laboratory] has been working on assessing perception uncertainty and expanding temporal reasoning for robotics for multiple years now, as they are the key roadblocks to deploying robots in the open world more broadly,†Waslander says.</p> <p>“We desperately need AI methods that can understand the persistence of objects over time, and ones that are aware of their own limitations and will stop and reason when something new or unexpected appears in their path. This is what our research aims to do.â€&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 29 May 2024 14:59:42 +0000 rahul.kalvapalle 307958 at With the launch of its first satellite, student team charts a course to new knowledge /news/launch-its-first-satellite-student-team-charts-course-new-knowledge <span class="field field--name-title field--type-string field--label-hidden">With the launch of its first satellite, student team charts a course to new knowledge</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2024-01/UTAT-Space-Systems-HERON-launch-crop.jpg?h=d082dac7&amp;itok=9Wa3UXmZ 370w, /sites/default/files/styles/news_banner_740/public/2024-01/UTAT-Space-Systems-HERON-launch-crop.jpg?h=d082dac7&amp;itok=JbfGqxc8 740w, /sites/default/files/styles/news_banner_1110/public/2024-01/UTAT-Space-Systems-HERON-launch-crop.jpg?h=d082dac7&amp;itok=FhGwd94z 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2024-01/UTAT-Space-Systems-HERON-launch-crop.jpg?h=d082dac7&amp;itok=9Wa3UXmZ" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>rahul.kalvapalle</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-01-19T12:44:03-05:00" title="Friday, January 19, 2024 - 12:44" class="datetime">Fri, 01/19/2024 - 12:44</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>A Falcon 9 rocket lifts off from Vandenberg Space Force Base on Nov. 11, 2023, carrying a satellite designed and built by the&nbsp;¯r¶¹Íø Aerospace Team (photo courtesy of SpaceX)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/aerospace" hreflang="en">Aerospace</a></div> <div class="field__item"><a href="/news/tags/electrical-computer-engineering" hreflang="en">Electrical &amp; Computer Engineering</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/mechanical-industrial-engineering" hreflang="en">Mechanical &amp; Industrial Engineering</a></div> <div class="field__item"><a href="/news/tags/space" hreflang="en">Space</a></div> <div class="field__item"><a href="/news/tags/undergraduate-students" hreflang="en">Undergraduate Students</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">“We worked on this project for so long with such a narrow focus that actually seeing it deployed was very rewardingâ€</div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Students in the ¯r¶¹Íø’s Faculty of Applied Science &amp; Engineering recently&nbsp;gathered in the basement of the Sandford Fleming Building – known to many as “The Pit†– to witness the deployment of HERON Mk. II into space.&nbsp;&nbsp;</p> <p>The 3U CubeSat satellite, built and operated by the space systems division of the ¯r¶¹Íø Aerospace Team (UTAT), was launched into orbit on a Falcon 9 rocket on Nov. 11, 2023 as part of SpaceX’s Transporter-9 rideshare mission that lifted off from the Vandenberg Space Force Base near Lompoc, Calif.&nbsp;&nbsp;</p> <p>The feat was entirely student funded with support from ¯r¶¹Íø Engineering through student levies and UTAT-led fundraising efforts.&nbsp;&nbsp;&nbsp;</p> <p>“The experience of the launch was very surreal,â€&nbsp;says master’s degree student<strong>&nbsp;Benjamin Nero</strong>, HERON’s current mission manger.&nbsp;&nbsp;</p> <p>“We worked on this project for so long with such a narrow focus that actually seeing it deployed was very rewarding.â€&nbsp;&nbsp;&nbsp;</p> <p>“There are any number of things that could go wrong that might prevent a satellite from deploying,â€&nbsp;adds&nbsp;<strong>Zachary Teper</strong>, a fellow master’s degree candidate<strong>&nbsp;</strong>who is part of the technical development team working on HERON’s ground station.&nbsp;</p> <p>“So, watching each of the call outs coming out of the SpaceX mission control, seeing the rocket go up and meet every one of its mission objectives and then finally seeing our satellite get ejected out of the dispenser in the correct trajectory was a big relief&nbsp;– because we knew that it was finally in space and on the right path.â€&nbsp;</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-01/UTAT-Space-Systems-team-ground-station-crop.jpg?itok=fBLrHH7z" width="750" height="500" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>Members of the UTAT space systems division gather on the sixth-floor roof of the Bahen Centre for Information Technology with the fully assembled ground station (photo by UTAT Space Systems)</em></figcaption> </figure> <p>Launching HERON – short for High frequency Educational Radio communications On a Nanosatellite –&nbsp;was the culmination of years of teamwork that brought together the efforts of more than 100 students.&nbsp;</p> <p>HERON Mk. II, the second iteration of UTAT’s spacecraft, was originally designed and built between 2016 and 2018 for the fourth edition of the <a href="https://www.ic.gc.ca/eic/site/060.nsf/vwapj/CSDCMS.pdf/$file/CSDCMS.pdf">Canadian Satellite Design Challenge</a>.&nbsp;Since space systems division was formed in 2014, many of the students who worked on the initial HERON design and build have since graduated. But the current operations team continued to develop the satellite and renew the student levy that allowed them to secure their space launch.&nbsp;&nbsp;</p> <p>“The original objective for HERON was to conduct a biology experiment in space,†says Nero, who joined the team in 2019 during his second year of undergraduate studies.&nbsp;“But because of delays in the licensing process, we were unable to continue that mission objective. So, we re-scoped and shifted our focus to amateur radio communication and knowledge building.â€&nbsp;&nbsp;</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-01/5-crop.jpg?itok=pLDFm8_s" width="750" height="422" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>From left to right: HERON Mk. I (2016), HERON Mk. II Prototype (2018), HERON Mk. II Softstack (2020), HERON Mk. II Flight Model (2021) (photos by UTAT Space Systems)</em></figcaption> </figure> <p>Once the satellite’s final assembly was completed in 2021, the team began flight model testing and assembling a ground station, while also managing the logistics of the regulatory approvals needed to complete the launch.&nbsp;&nbsp;</p> <p>“It’s difficult to put something in space, both technically and bureaucratically,†says Nero. “There are a lot of different governments that care about what you’re doing and want to know when and how you’re doing it.â€&nbsp;</p> <p>Getting to space was a significant milestone for the team, but it’s still only the beginning of their work.&nbsp;</p> <p>“The goal for us as a design team is to start gathering institutional knowledge that we didn’t have before,†says&nbsp;<strong>Reid Sox-Harris</strong>, an undergraduate student&nbsp;who is HERON’s ground station manager and the electrical lead for UTAT’s next space mission, FINCH&nbsp;(Field Imaging Nanosatellite for Crop residue Hyperspectral mapping).&nbsp;</p> <p>“We’ve never operated a satellite. So, we’re taking a lot of lessons learned with us through this process.â€â€¯&nbsp;&nbsp;</p> <p>For example, when a satellite is deployed for the first time, the ground control team only has a rough idea of its movement and eventual location. They must simulate the launch to figure out exactly where it is before they can establish a connection. And when they receive new positional data, they must rerun their simulation.&nbsp;&nbsp;</p> <p>“We have to take into account effects such as air resistance, or the sun’s solar cycles and the gravitational effects of the sun, the moon and the Earth – it’s a fairly complicated simulation,†Sox-Harris says.&nbsp;<br> <br> Nero adds: “Part of the difficulty with a simulation is that a model is only useful for a certain period. An old estimate could result in as much as a few kilometres of drift from the satellite’s actual position per day.â€</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-01/HERON-gs_937-crop.jpg?itok=FpwF15sA" width="750" height="500" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>HERON’s ground station on the roof of the Bahen Centre (photo by UTAT Space Systems)</em></figcaption> </figure> <p>The team was not only tasked with designing a ground station capable of communicating with a satellite more than 500 kilometres away, but one that can survive a frigid and snowy Canadian winter.</p> <p>“For any project, the most important thing you should be doing is testing,†says second-year student&nbsp;<strong>Swarnava Ghosh</strong>, who primarily works on the ground station software.&nbsp;&nbsp;“One challenge with our ground station currently is that there are too many variables that are not fully tested – and everything needs to be perfect in the chain for the communication to work. If the ground station is not pointing in the right direction, we won’t get a signal and we won’t establish communication. And if the amplifier is not working, then we won’t establish communication.â€â€¯&nbsp;</p> <p>The team is confident that they will ultimately resolve any outstanding issues and establish communications with HERON. More importantly, they will be able to take what they’ve learned and apply it to the next&nbsp;mission.</p> <p>“With FINCH, we want to make sure the&nbsp;ground station software and satellite can communicate on the ground,†says Sox-Harris. “Right now, there are over 500 kilometres between the satellite and ground station, so we can’t fly up there and test whether a command has worked.â€&nbsp;</p> <p>FINCH is set to launch in late 2025 on a rideshare rocket flight. Its&nbsp;current mission objective is to generate hyperspectral imaging maps of crop residue on farm fields in Manitoba from a low-Earth orbit.&nbsp;&nbsp;&nbsp;</p> <p>There are many technical developments that are new to FINCH that weren’t applicable to HERON, the team says, including a novel optic system for remote sensing that is being developed by students.&nbsp;&nbsp;</p> <p>“The risks associated with FINCH are mitigated by the work that is being performed by HERON right now.&nbsp;We’re learning many lessons that&nbsp;will be directly applicable to our next mission, and we’ll continue to learn from HERON for at least another year or more,†says Sox-Harris.&nbsp;</p> <p>“This means the FINCH mission can be more complicated, it can move faster and ultimately we can have better reliability, which is something that we always strive for in aerospace.â€&nbsp;</p> <p>&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Fri, 19 Jan 2024 17:44:03 +0000 rahul.kalvapalle 305347 at ¯r¶¹Íø researchers partner with Siemens Energy to tackle sustainable energy production /news/u-t-researchers-partner-siemens-energy-tackle-sustainable-energy-production <span class="field field--name-title field--type-string field--label-hidden">¯r¶¹Íø researchers partner with Siemens Energy to tackle sustainable energy production</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2024-01/MicrosoftTeams-image-%287%29-crop.jpg?h=81d682ee&amp;itok=iaFYfLIx 370w, /sites/default/files/styles/news_banner_740/public/2024-01/MicrosoftTeams-image-%287%29-crop.jpg?h=81d682ee&amp;itok=RYJSYC_g 740w, /sites/default/files/styles/news_banner_1110/public/2024-01/MicrosoftTeams-image-%287%29-crop.jpg?h=81d682ee&amp;itok=vqhA3Qfg 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2024-01/MicrosoftTeams-image-%287%29-crop.jpg?h=81d682ee&amp;itok=iaFYfLIx" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2024-01-10T14:50:56-05:00" title="Wednesday, January 10, 2024 - 14:50" class="datetime">Wed, 01/10/2024 - 14:50</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>PhD student Yazdan Naderzadeh (left) investigates flames with lasers in the Propulsion and Energy Conversion Lab at UTIAS (photo by Neil Ta)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/selah-katona" hreflang="en">Selah Katona</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/industry-partnerships" hreflang="en">Industry Partnerships</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/sustainability" hreflang="en">Sustainability</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">'Together, we hope to unravel the complexities of hydrogen combustion, paving the way for cleaner and more efficient engines'</div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Researchers in the ¯r¶¹Íø’s Faculty of Applied Science &amp; Engineering have partnered with Siemens Energy to tackle a key challenge in the energy sector: sustainable energy conversion for propulsion and power generation – such as developing gas turbine engines that can run on sustainable energy sources like hydrogen.</p> <p>Led by Associate Professor&nbsp;<strong>Swetaprovo Chaudhuri</strong>&nbsp;from the ¯r¶¹Íø Institute of Aerospace Studies (UTIAS), the initiative aims to rethink traditional gas turbine engines to reduce carbon emissions from both aviation and land-based fuel consumption.&nbsp;&nbsp;</p> <p>Chaudhuri’s team is exploring hydrogen combustion as a viable option since it can be burned without producing carbon dioxide.</p> <p>However, the transition is not without its challenges. For one, hydrogen is a small, highly reactive molecule, causing flames to move five to ten times faster than those of natural gas. This makes existing combustors and engines that run on natural gas incapable of handling pure hydrogen.&nbsp;</p> <p>Another key challenge is the lack of infrastructure available to transport hydrogen in the way pipelines are used to move natural gas. Until such infrastructure is developed, Chaudhuri’s team is researching how to build reliable fuel-flex gas turbine engines that can work on both fuels.&nbsp;&nbsp;</p> <p>“Hydrogen and natural gas are vastly different - it’s like comparing a Bugatti Veyron to a public bus in both speed and size,†says Chaudhuri, who leads the Propulsion &amp; Energy Conversion Laboratory at UTIAS. “The critical question is: ‘how can engines be designed to accommodate both fuels seamlessly?’â€&nbsp;</p> <p>The team is led by Chaudhuri in collaboration with Associate Professor <strong>Jeff Bergthorson</strong> at McGill University, Professor&nbsp;<strong>Étienne Robert</strong>&nbsp;and Assistant Professor&nbsp;<strong>Bruno Savard</strong>&nbsp;at Polytechnique Montréal, <strong>Patrizio Vena</strong> at National Research Council Canada and engineers from Siemens Energy Canada in Montreal.</p> <p>The project&nbsp;received an Alliance Mission Grant from the Natural Sciences and Engineering Research Council (NSERC)&nbsp;to build a comprehensive understanding that will guide the creation of fuel-flex gas turbine engines.</p> <figure role="group" class="caption caption-drupal-media align-center"> <div> <div class="field field--name-field-media-image field--type-image field--label-hidden field__item"> <img loading="lazy" src="/sites/default/files/styles/scale_image_750_width_/public/2024-01/MicrosoftTeams-image-%288%29-crop.jpg?itok=IkkOJvxr" width="750" height="501" alt="&quot;&quot;" class="image-style-scale-image-750-width-"> </div> </div> <figcaption><em>PhD candidate Yazdan Naderzadeh (left) and master’s student Scott Watson from the Propulsion and Energy Conversion Lab work with a swirling hydrogen flame (photo Praful Kumar)</em></figcaption> </figure> <p>The researchers have constructed a model lab-scale combustor at the Propulsion and Energy Conversion Laboratory at UTIAS, to study the behaviour of natural gas and hydrogen flames within engines. These experiments aim to understand the intricacies of hydrogen combustion to establish engineering principles and guidelines for future engine development.&nbsp;&nbsp;</p> <p>While practical applications are on the horizon, the immediate goal is to establish a robust knowledge base that will be essential for designing engines that can efficiently and safely use hydrogen as a fuel source.&nbsp;&nbsp;</p> <p>“Currently, long-range aircrafts cannot, even theoretically, fly on batteries. We need to make significant strides towards combustion engines that use hydrogen or other carbon-neutral fuels to substantially reduce carbon emissions in these critical sectors,†says Chaudhuri.&nbsp;</p> <p>In a different, stand-alone project, Chaudhuri and his research group are developing a self-decarbonizing combustor, which separates hydrogen and carbon from natural gas within the combustor. This process not only allows for hydrogen to be used for fuel but could also allow the carbon byproduct to be used to offset the additional cost associated with decarbonization.&nbsp;&nbsp;</p> <p>“Our collaboration with Siemens Energy marks an exciting synergy between academia and industry,†says Chaudhuri. “Siemens Energy’s gas turbines for generating power have historically used natural gas, so this partnership represents a significant step towards a greener future.</p> <p>“Together, we hope to unravel the complexities of hydrogen combustion, paving the way for cleaner and more efficient engines.â€&nbsp;</p> <p>The development and commissioning of the fuel-flex combustor, capable of safely stabilizing both hydrogen and natural gas flames, presents numerous research opportunities for students.</p> <p><strong>Yazdan Naderzadeh</strong> and <strong>Scott Watson</strong>, a PhD candidate and master’s student respectively in Chaudhuri’s lab, are working on the project. “I am so excited to work on the ongoing fuel-flex combustor project, addressing concerns related to clean emissions and compatibility with conventional gas turbine burners,†says Naderzadeh. “This endeavor allows for a thorough study and understanding of the challenges associated with hydrogen as a prospective fuel in the aviation industry and gas power plants.â€</p> <h3><a href="https://bluedoor.utoronto.ca/">Learn more about industry partnerships at ¯r¶¹Íø</a></h3> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 10 Jan 2024 19:50:56 +0000 Christopher.Sorensen 305214 at AI algorithm improves predictive models of complex dynamical systems /news/ai-algorithm-improves-predictive-models-complex-dynamical-systems <span class="field field--name-title field--type-string field--label-hidden">AI algorithm improves predictive models of complex dynamical systems</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/2023-11/Nair-Course-photo.jpg?h=afdc3185&amp;itok=Lt8It-CP 370w, /sites/default/files/styles/news_banner_740/public/2023-11/Nair-Course-photo.jpg?h=afdc3185&amp;itok=b2uZAIrk 740w, /sites/default/files/styles/news_banner_1110/public/2023-11/Nair-Course-photo.jpg?h=afdc3185&amp;itok=Wdk23c30 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/2023-11/Nair-Course-photo.jpg?h=afdc3185&amp;itok=Lt8It-CP" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2023-11-15T09:20:37-05:00" title="Wednesday, November 15, 2023 - 09:20" class="datetime">Wed, 11/15/2023 - 09:20</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item"><p><em>From left to right: Professor Prasanth Nair and PhD student Kevin Course are the authors of a new paper in Nature that introduces&nbsp;a new machine learning algorithm that addresses the challenge of imperfect knowledge about system dynamics&nbsp;(supplied images)</em></p> </div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/breaking-research" hreflang="en">Breaking Research</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="field field--name-field-subheadline field--type-string-long field--label-above"> <div class="field__label">Subheadline</div> <div class="field__item">Developed by ¯r¶¹Íø researchers, the new approach could have applications ranging from predicting the performance of aircraft engines to forecasting climate change</div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>Researchers at&nbsp;the&nbsp;¯r¶¹Íø have made a significant step towards enabling reliable predictions of complex dynamical systems when there are many uncertainties in the available data or missing information.</p> <p>In <a href="https://www.nature.com/articles/s41586-023-06574-8">a&nbsp;recent paper published in&nbsp;<em>Nature</em></a>, <strong>Prasanth B. Nair</strong>, a professor at the ¯r¶¹Íø Institute&nbsp;of Aerospace Studies (UTIAS) in the Faculty of Applied Science &amp; Engineering, and UTIAS PhD candidate <strong>Kevin&nbsp;Course</strong>&nbsp;introduced a new machine learning algorithm that surmounts the real-world challenge of imperfect knowledge about system dynamics. The computer-based mathematical modelling approach is used for problem solving and better decision making in complex systems, where many components interact with each other.&nbsp;&nbsp;</p> <p>The researchers say the work could have numerous applications ranging from predicting the performance of aircraft engines to forecasting changes in global climate or the spread of viruses.&nbsp;&nbsp;</p> <p>“For the first time, we are able to apply state estimation to problems where we don’t know the governing equations, or the governing equations have a lot of missing terms,†says Course, who is the paper’s first author.&nbsp;&nbsp;&nbsp;</p> <p>“In contrast to standard techniques, which usually require a state estimate to infer the governing equations and vice-versa, our method learns the missing terms in the mathematical model and a state estimate simultaneously.â€&nbsp;&nbsp;</p> <p>State estimation, also known as data assimilation, refers to the process of combining observational data with computer models to estimate the current state of a system. Traditionally, it requires strong assumptions about the type of uncertainties that exist in a mathematical model.&nbsp;&nbsp;&nbsp;</p> <p>“For example, let’s say you have constructed a computer model that predicts the weather and at the same time, you have access to real-time data from weather stations providing actual temperature readings,†says Nair. “Due to the model’s inherent limitations and simplifications – which is often unavoidable when dealing with complex real-world systems – the model predictions may not match the actual observed temperature you are seeing.&nbsp;&nbsp;</p> <p>“State estimation combines the model’s prediction with the actual observations to provide a corrected or better-calibrated estimate of the current temperature. It effectively assimilates the data into the model to correct its state.â€&nbsp;&nbsp;</p> <p>However, it has been previously difficult to estimate the underlying state of complex dynamical systems in situations where the governing equations are completely or partially unknown. The new algorithm provides a rigorous statistical framework to address this long-standing problem.&nbsp;&nbsp;</p> <p>“This problem is akin to deciphering the ‘laws’ that a system obeys without having explicit knowledge about them,†says Nair, whose research group is developing algorithms for mathematical modelling of systems and phenomena that are encountered in various areas of engineering and science.&nbsp;&nbsp;</p> <p>A byproduct of Course and Nair’s algorithm is that it also helps to characterize missing terms or even the entirety of the governing equations, which determine how the values of unknown variables change when one or more of the known variables change.&nbsp;&nbsp;&nbsp;</p> <p>The main innovation underpinning the work is a reparametrization trick for stochastic variational inference with Markov Gaussian processes that enables an approximate Bayesian approach to solve such problems. This new development allows researchers to&nbsp;deduce the equations that govern the dynamics of complex systems and arrive at a state estimate using indirect and “noisy†measurements.&nbsp;&nbsp;</p> <p>“Our approach is computationally attractive since it leverages stochastic&nbsp;– that is randomly determined&nbsp;– approximations that can be efficiently computed in parallel and, in addition, it does not rely on computationally expensive forward solvers in training,†says Course.&nbsp;&nbsp;&nbsp;</p> <p>While Course and Nair approached their research from a theoretical viewpoint, they were able to demonstrate practical impact by applying their algorithm to problems ranging from modelling fluid flow to predicting the motion of black holes.&nbsp;&nbsp;&nbsp;</p> <p>“Our work is relevant to several branches of sciences, engineering and finance as researchers from these fields often interact with systems where first-principles models are difficult to construct or existing models are insufficient to explain system behaviour,†says Nair.&nbsp;&nbsp;</p> <p>“We believe this work will open the door for practitioners in these fields to better intuit the systems they study,†adds Course. “Even in situations where high-fidelity mathematical models are available, this work can be used for probabilistic model calibration and to discover missing physics in existing models.&nbsp;&nbsp;&nbsp;</p> <p>“We have also been able to successfully use our approach to efficiently train neural stochastic differential equations, which is a type of machine learning model that has shown promising performance for time-series datasets.â€&nbsp;&nbsp;&nbsp;&nbsp;</p> <p>While the paper primarily addresses challenges in state estimation and governing equation discovery, the researchers say it provides a general groundwork for robust data-driven techniques in computational science and engineering.&nbsp;&nbsp;</p> <p>“As an example,&nbsp;our research group is currently using this framework to construct probabilistic reduced-order models of complex systems. We hope to expedite decision-making processes integral to the optimal design, operation and control of real-world systems,†says Nair.&nbsp;&nbsp;&nbsp;</p> <p>“Additionally, we are also studying how the inference methods stemming from our research may offer deeper statistical insights into stochastic differential equation-based generative models that are now widely used in many artificial intelligence applications.â€&nbsp;&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 15 Nov 2023 14:20:37 +0000 Christopher.Sorensen 304471 at Researchers help robots navigate crowded spaces with new visual perception method /news/researchers-help-robots-navigate-crowded-spaces-new-visual-perception-method <span class="field field--name-title field--type-string field--label-hidden">Researchers help robots navigate crowded spaces with new visual perception method</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=FnXXVi6F 370w, /sites/default/files/styles/news_banner_740/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=7k3rU_TC 740w, /sites/default/files/styles/news_banner_1110/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=mtI0yfdN 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/iStock-1279493735-crop.jpg?h=afdc3185&amp;itok=FnXXVi6F" alt="crowded downtown city street with many people walking across an intersection"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-11-09T15:10:52-05:00" title="Wednesday, November 9, 2022 - 15:10" class="datetime">Wed, 11/09/2022 - 15:10</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">Researchers from the ¯r¶¹Íø Institute for Aerospace Studies have developed a system that improves how robots stitch together a set of images taken from a moving camera to build a 3D model of their environments (photo by iStock/LeoPatrizi)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/breaking-research" hreflang="en">Breaking Research</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/alumni" hreflang="en">Alumni</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>A team of researchers at the ¯r¶¹Íø&nbsp;has found a way to enhance the visual perception of robotic systems by coupling two different types of neural networks.</p> <p>The innovation could help autonomous vehicles navigate busy streets or enable medical robots to work effectively in crowded hospital hallways.&nbsp;</p> <p>“What tends to happen in our field is that when systems don’t perform as expected, the designers make the networks bigger – they add more parameters,†says <strong>Jonathan Kelly</strong>, an assistant professor at the&nbsp;<a href="https://www.utias.utoronto.ca/">¯r¶¹Íø Institute for Aerospace Studies</a> in the Faculty of Applied Science &amp; Engineering.</p> <p>“What we’ve done instead is to carefully study how the pieces should fit together. Specifically, we investigated how two pieces of the motion estimation problem – accurate perception of depth and motion – can be joined together in a robust way.â€&nbsp;&nbsp;</p> <p>Researchers in Kelly’s&nbsp;<a href="https://starslab.ca/">Space and Terrestrial Autonomous Robotic Systems</a>&nbsp;lab aim to build reliable systems that can help humans accomplish a variety of tasks. For example, they’ve designed&nbsp;<a href="https://news.engineering.utoronto.ca/wheelchairs-get-robotic-retrofit-become-self-driving/">an electric wheelchair that can automate some common tasks</a>&nbsp;such as navigating through doorways.&nbsp;&nbsp;</p> <p>More recently, they’ve focused on techniques that will help robots move out of the carefully controlled environments in which they are commonly used today and into the less predictable world&nbsp;humans are accustomed to navigating.&nbsp;&nbsp;</p> <p>“Ultimately, we are looking to develop situational awareness for highly dynamic environments where people operate, whether it’s a crowded hospital hallway, a busy public square&nbsp;or a city street full of traffic and pedestrians,†says Kelly.&nbsp;&nbsp;</p> <p>One challenging problem that robots must solve in all of these spaces is known to the robotics community as “structure from motion.†This is the process by which robots stitch together a set of images taken from a moving camera to build a 3D model of the environment they are in. The process is analogous to the way humans use their eyes to perceive the world around them.&nbsp;&nbsp;</p> <p>In today’s robotic systems, structure from motion is typically achieved in two steps, each of which uses different information from a set of monocular images. One is depth perception, which tells the robot how far away the objects in its field of vision are. The other, known as egomotion, describes the 3D movement of the robot in relation to its environment.&nbsp;</p> <p>“Any robot navigating within a space needs to know how far static and dynamic objects are in relation to itself, as well as how its motion changes a scene,†says Kelly. “For example, when a train moves along a track, a passenger looking out a window can observe that objects at a distance appear to move slowly, while objects nearby zoom past.â€&nbsp;&nbsp;</p> <p>&nbsp;</p> <div class="media_embed" height="500px" width="750px"><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen frameborder="0" height="500px" src="https://www.youtube.com/embed/8Oij81bEoH0" title="YouTube video player" width="750px"></iframe></div> <p>&nbsp;</p> <p>The challenge is that in many current systems, depth estimation is separated from motion estimation – there is no explicit sharing of information between the two neural networks. Joining depth and motion estimation together ensures that each&nbsp;is consistent with the other.&nbsp;&nbsp;&nbsp;</p> <p>“There are constraints on depth that are defined by motion, and there are constraints on motion that are defined by depth,†says Kelly. “If the system doesn’t couple these two neural network components, then&nbsp;the end result is an inaccurate estimate of where everything is in the world and where the robot is in relation.â€&nbsp;</p> <p>In a recent study, two of Kelly’s&nbsp;students –&nbsp;<strong>Brandon Wagstaff</strong>, a PhD candidate, and former PhD student&nbsp;<strong>Valentin Peretroukhin</strong>&nbsp;–&nbsp;investigated and improved on existing structure from motion methods.&nbsp;</p> <p>Their new system makes the egomotion prediction a function of depth, increasing the system’s overall accuracy and reliability.&nbsp;<a href="https://www.youtube.com/watch?v=6QEDCooyUjE">They recently presented their work</a> at the International Conference on Intelligent Robots and Systems (IROS) in Kyoto, Japan.&nbsp;&nbsp;</p> <p>“Compared with existing learning-based methods, our new system was able to reduce the motion estimation error by approximately 50 per cent,†says Wagstaff.&nbsp;&nbsp;</p> <p>“This improvement in motion estimation accuracy was demonstrated not only on data similar to that used to train the network, but also on significantly different forms of data, indicating that the proposed method was able to generalize across many different environments.â€&nbsp;</p> <p>Maintaining accuracy when operating within novel environments is challenging for neural networks. The team has since expanded their research beyond visual motion estimation to include inertial sensing – an extra sensor that is akin to the vestibular system in the human ear.&nbsp;&nbsp;</p> <p>“We are now working on robotic applications that can mimic a human’s eyes and inner ears, which provides information about balance, motion and acceleration,†says Kelly.&nbsp;&nbsp;&nbsp;</p> <p>“This will enable even more accurate motion estimation to handle situations like dramatic scene changes — such as an environment suddenly getting darker when a car enters a tunnel, or a camera failing when it looks directly into the sun.â€&nbsp;&nbsp;</p> <p>The potential applications for such new approaches are diverse, from improving the handling of self-driving vehicles to enabling aerial drones to fly safely through crowded environments to deliver goods or carry out environmental monitoring.&nbsp;&nbsp;</p> <p>“We are not building machines that are left in cages,†says Kelly. “We want to design robust robots that can move safely around people and environments.â€&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Wed, 09 Nov 2022 20:10:52 +0000 Christopher.Sorensen 177980 at ¯r¶¹Íø student team takes first place at International Small Wind Turbine Contest /news/u-t-student-team-takes-first-place-international-small-wind-turbine-contest <span class="field field--name-title field--type-string field--label-hidden">¯r¶¹Íø student team takes first place at International Small Wind Turbine Contest</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/UTWind_DesignPhoto-crop.jpg?h=afdc3185&amp;itok=3vOUHy6F 370w, /sites/default/files/styles/news_banner_740/public/UTWind_DesignPhoto-crop.jpg?h=afdc3185&amp;itok=tBVfRCqT 740w, /sites/default/files/styles/news_banner_1110/public/UTWind_DesignPhoto-crop.jpg?h=afdc3185&amp;itok=4m2zo9jM 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/UTWind_DesignPhoto-crop.jpg?h=afdc3185&amp;itok=3vOUHy6F" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-07-04T11:24:48-04:00" title="Monday, July 4, 2022 - 11:24" class="datetime">Mon, 07/04/2022 - 11:24</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">The UTWind team stands next to their winning prototype turbine at the Open Jet Facility wind tunnel at Delft University of Technology (photo by Niels Adema/Hanze University of Applied Sciences)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/tyler-irving" hreflang="en">Tyler Irving</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/global-lens" hreflang="en">Global Lens</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/alumni" hreflang="en">Alumni</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/global" hreflang="en">Global</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/sustainability" hreflang="en">Sustainability</a></div> <div class="field__item"><a href="/news/tags/undergraduate-students" hreflang="en">Undergraduate Students</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>In their first-ever competition,&nbsp;UTWind&nbsp;– <a href="https://www.utwind.com/">a team of undergraduate and graduate students</a> from the ¯r¶¹Íø’s Faculty of Applied Science &amp; Engineering – has taken the top prize in an international challenge to design and build a small-scale wind turbine.</p> <p>“While we always strived to be a competitive team from the beginning and knew that we had a strong design, we definitely didn’t expect to win first place,†says&nbsp;<strong>David Petriw</strong>, a third-year materials science and engineering student who is&nbsp;a member of UTWind.</p> <p>“The morale of the team is at an all-time high, and we are going to celebrate this win in a big way.â€</p> <p><a href="https://www.hanze.nl/nld/onderwijs/techniek/instituut-voor-engineering/organisatie/contest/international-small-wind-turbine-contest">The&nbsp;International Small Wind Turbine Contest</a> (ISWTC)&nbsp;is hosted annually by Hanze University of Applied Sciences in Groningen, Netherlands. To clinch first place, UTWind edged out teams from Denmark, Germany, Poland and Egypt.</p> <p>“The goal of ISWTC is to build and demonstrate a wind turbine designed for rural regions in Sub-Saharan Africa,†says&nbsp;<strong>Andrew Ilersich</strong>, a<b>&nbsp;</b>PhD candidate at the ¯r¶¹Íø Institute for Aerospace Studies (UTIAS) and&nbsp;aerodynamics lead for UTWind.</p> <p>“Every aspect of our design had to be tailored to, or at least compatible with, the region it would be sold and operated in. We also had to show that our design was sustainable, being made from recyclable, low-cost, and locally available materials.â€</p> <p>Unlike the large turbines used in commercial wind farms, which can rise to over 100 metres and generate megawatts of power each, small wind turbines (SWTs) are designed for generation on scales from a few hundred watts to a few kilowatts.</p> <p>To win the contest, teams must demonstrate top-of-class performance across a number of criteria, including power generation, cut-in speed, estimated annual energy production and coefficient of power, which is a measure of the turbine’s efficiency.</p> <p>Performance was measured at the Open Jet Facility wind tunnel at Delft University of Technology. After that, the teams headed to the&nbsp;Science of Making Torque Conference&nbsp;in Delft, to present their business case.</p> <p>&nbsp;</p> <div class="media_embed" width="1px"> <blockquote class="instagram-media" data-instgrm-captioned data-instgrm-permalink="https://www.instagram.com/p/CefDcCnKk_Y/?utm_source=ig_embed&amp;utm_campaign=loading" data-instgrm-version="14" height style=" background:#FFF; border:0; border-radius:3px; box-shadow:0 0 1px 0 rgba(0,0,0,0.5),0 1px 10px 0 rgba(0,0,0,0.15); margin: 1px; max-width:540px; min-width:326px; padding:0; width:99.375%; width:-webkit-calc(100% - 2px); width:calc(100% - 2px);" width="1px"> <div style="padding:16px;"> <div style=" display: flex; flex-direction: row; align-items: center;"> <div style="background-color: #F4F4F4; border-radius: 50%; flex-grow: 0; height: 40px; margin-right: 14px; width: 40px;">&nbsp;</div> <div style="display: flex; flex-direction: column; flex-grow: 1; justify-content: center;"> <div style=" background-color: #F4F4F4; border-radius: 4px; flex-grow: 0; height: 14px; margin-bottom: 6px; width: 100px;">&nbsp;</div> <div style=" background-color: #F4F4F4; border-radius: 4px; flex-grow: 0; height: 14px; width: 60px;">&nbsp;</div> </div> </div> <div style="padding: 19% 0;">&nbsp;</div> <div style="display:block; height:50px; margin:0 auto 12px; width:50px;"><a href="https://www.instagram.com/p/CefDcCnKk_Y/?utm_source=ig_embed&amp;utm_campaign=loading" style=" background:#FFFFFF; line-height:0; padding:0 0; text-align:center; text-decoration:none; width:100%;" target="_blank"><svg height="50px" version="1.1" viewBox="0 0 60 60" width="50px" xmlns:xlink="https://www.w3.org/1999/xlink"><g fill="none" fill-rule="evenodd" stroke="none" stroke-width="1"><g fill="#000000" transform="translate(-511.000000, -20.000000)"><g><path d="M556.869,30.41 C554.814,30.41 553.148,32.076 553.148,34.131 C553.148,36.186 554.814,37.852 556.869,37.852 C558.924,37.852 560.59,36.186 560.59,34.131 C560.59,32.076 558.924,30.41 556.869,30.41 M541,60.657 C535.114,60.657 530.342,55.887 530.342,50 C530.342,44.114 535.114,39.342 541,39.342 C546.887,39.342 551.658,44.114 551.658,50 C551.658,55.887 546.887,60.657 541,60.657 M541,33.886 C532.1,33.886 524.886,41.1 524.886,50 C524.886,58.899 532.1,66.113 541,66.113 C549.9,66.113 557.115,58.899 557.115,50 C557.115,41.1 549.9,33.886 541,33.886 M565.378,62.101 C565.244,65.022 564.756,66.606 564.346,67.663 C563.803,69.06 563.154,70.057 562.106,71.106 C561.058,72.155 560.06,72.803 558.662,73.347 C557.607,73.757 556.021,74.244 553.102,74.378 C549.944,74.521 548.997,74.552 541,74.552 C533.003,74.552 532.056,74.521 528.898,74.378 C525.979,74.244 524.393,73.757 523.338,73.347 C521.94,72.803 520.942,72.155 519.894,71.106 C518.846,70.057 518.197,69.06 517.654,67.663 C517.244,66.606 516.755,65.022 516.623,62.101 C516.479,58.943 516.448,57.996 516.448,50 C516.448,42.003 516.479,41.056 516.623,37.899 C516.755,34.978 517.244,33.391 517.654,32.338 C518.197,30.938 518.846,29.942 519.894,28.894 C520.942,27.846 521.94,27.196 523.338,26.654 C524.393,26.244 525.979,25.756 528.898,25.623 C532.057,25.479 533.004,25.448 541,25.448 C548.997,25.448 549.943,25.479 553.102,25.623 C556.021,25.756 557.607,26.244 558.662,26.654 C560.06,27.196 561.058,27.846 562.106,28.894 C563.154,29.942 563.803,30.938 564.346,32.338 C564.756,33.391 565.244,34.978 565.378,37.899 C565.522,41.056 565.552,42.003 565.552,50 C565.552,57.996 565.522,58.943 565.378,62.101 M570.82,37.631 C570.674,34.438 570.167,32.258 569.425,30.349 C568.659,28.377 567.633,26.702 565.965,25.035 C564.297,23.368 562.623,22.342 560.652,21.575 C558.743,20.834 556.562,20.326 553.369,20.18 C550.169,20.033 549.148,20 541,20 C532.853,20 531.831,20.033 528.631,20.18 C525.438,20.326 523.257,20.834 521.349,21.575 C519.376,22.342 517.703,23.368 516.035,25.035 C514.368,26.702 513.342,28.377 512.574,30.349 C511.834,32.258 511.326,34.438 511.181,37.631 C511.035,40.831 511,41.851 511,50 C511,58.147 511.035,59.17 511.181,62.369 C511.326,65.562 511.834,67.743 512.574,69.651 C513.342,71.625 514.368,73.296 516.035,74.965 C517.703,76.634 519.376,77.658 521.349,78.425 C523.257,79.167 525.438,79.673 528.631,79.82 C531.831,79.965 532.853,80.001 541,80.001 C549.148,80.001 550.169,79.965 553.369,79.82 C556.562,79.673 558.743,79.167 560.652,78.425 C562.623,77.658 564.297,76.634 565.965,74.965 C567.633,73.296 568.659,71.625 569.425,69.651 C570.167,67.743 570.674,65.562 570.82,62.369 C570.966,59.17 571,58.147 571,50 C571,41.851 570.966,40.831 570.82,37.631" /></g></g></g></svg></a></div> <div style="padding-top: 8px;"> <div style=" color:#3897f0; font-family:Arial,sans-serif; font-size:14px; font-style:normal; font-weight:550; line-height:18px;"><a href="https://www.instagram.com/p/CefDcCnKk_Y/?utm_source=ig_embed&amp;utm_campaign=loading" style=" background:#FFFFFF; line-height:0; padding:0 0; text-align:center; text-decoration:none; width:100%;" target="_blank">View this post on Instagram</a></div> </div> <div style="padding: 12.5% 0;">&nbsp;</div> <div style="display: flex; flex-direction: row; margin-bottom: 14px; align-items: center;"> <div> <div style="background-color: #F4F4F4; border-radius: 50%; height: 12.5px; width: 12.5px; transform: translateX(0px) translateY(7px);">&nbsp;</div> <div style="background-color: #F4F4F4; height: 12.5px; transform: rotate(-45deg) translateX(3px) translateY(1px); width: 12.5px; flex-grow: 0; margin-right: 14px; margin-left: 2px;">&nbsp;</div> <div style="background-color: #F4F4F4; border-radius: 50%; height: 12.5px; width: 12.5px; transform: translateX(9px) translateY(-18px);">&nbsp;</div> </div> <div style="margin-left: 8px;"> <div style=" background-color: #F4F4F4; border-radius: 50%; flex-grow: 0; height: 20px; width: 20px;">&nbsp;</div> <div style=" width: 0; height: 0; border-top: 2px solid transparent; border-left: 6px solid #f4f4f4; border-bottom: 2px solid transparent; transform: translateX(16px) translateY(-4px) rotate(30deg)">&nbsp;</div> </div> <div style="margin-left: auto;"> <div style=" width: 0px; border-top: 8px solid #F4F4F4; border-right: 8px solid transparent; transform: translateY(16px);">&nbsp;</div> <div style=" background-color: #F4F4F4; flex-grow: 0; height: 12px; width: 16px; transform: translateY(-4px);">&nbsp;</div> <div style=" width: 0; height: 0; border-top: 8px solid #F4F4F4; border-left: 8px solid transparent; transform: translateY(-4px) translateX(8px);">&nbsp;</div> </div> </div> <div style="display: flex; flex-direction: column; flex-grow: 1; justify-content: center; margin-bottom: 24px;"> <div style=" background-color: #F4F4F4; border-radius: 4px; flex-grow: 0; height: 14px; margin-bottom: 6px; width: 224px;">&nbsp;</div> <div style=" background-color: #F4F4F4; border-radius: 4px; flex-grow: 0; height: 14px; width: 144px;">&nbsp;</div> </div> <p style=" color:#c9c8cd; font-family:Arial,sans-serif; font-size:14px; line-height:17px; margin-bottom:0; margin-top:8px; overflow:hidden; padding:8px 0 7px; text-align:center; text-overflow:ellipsis; white-space:nowrap;"><a href="https://www.instagram.com/p/CefDcCnKk_Y/?utm_source=ig_embed&amp;utm_campaign=loading" style=" color:#c9c8cd; font-family:Arial,sans-serif; font-size:14px; font-style:normal; font-weight:normal; line-height:17px; text-decoration:none;" target="_blank">A post shared by UTWind (@utwindclub)</a></p> </div> </blockquote> <script async height src="//www.instagram.com/embed.js" width="1px"></script></div> <p>&nbsp;</p> <p>The process of creating the prototype took more than a year from start to finish.</p> <p>“We began the design phase in the beginning of 2021 and the whole assembly was built in winter semester 2022,†says&nbsp;<strong>Ashley Best</strong>, a third-year student in materials science and engineering who is<strong>&nbsp;</strong>media team lead for UTWind.</p> <p>“Our turbine is made from wood and 3D-printed plastics. A few parts were outsourced to our sponsoring machine shop, Protocase, but the majority of the fabrication was done in house by our team – 3D printing, laser cutting, drill pressing, lathing, milling and assembly.â€</p> <p>“One of the things that set our team apart was our high coefficient of power, even when operating at very low wind speeds,†says <strong>Suraj Bansal</strong>,<strong>&nbsp;</strong>UTWind co-president and technical adviser and a PhD candidate at UTIAS.</p> <p>“In addition, we had a very modular, low-cost and sustainable construction, as well as a self-starting wind-turbine design thanks to our active pitch control system. We are currently creating a mobile app to control and monitor the wind turbine performance right from our mobile devices.â€</p> <p>UTWind is one of ¯r¶¹Íø Engineering’s newest design teams, co-founded in January 2021 by Bansal and UTIAS alumnus&nbsp;<strong>Ben Gibson</strong>.</p> <p>“I was a member of a similar wind turbine design team at the University of Manitoba, while Suraj had prior experience from his master’s research work in the U.S. to design extreme-scale wind turbines,†Gibson says.</p> <p>“We wanted to pass as much of that knowledge on as we could, while both having fun and pushing ourselves to the maximum. And so far, it has worked out great.â€</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Mon, 04 Jul 2022 15:24:48 +0000 Christopher.Sorensen 175486 at ¯r¶¹Íø's aUToronto team wins first competition of AutoDrive Challenge sequel /news/u-t-s-autoronto-team-wins-first-competition-autodrive-challenge-sequel <span class="field field--name-title field--type-string field--label-hidden">¯r¶¹Íø's aUToronto team wins first competition of AutoDrive Challenge sequel </span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/aUToronto2022_crop.jpg?h=afdc3185&amp;itok=HFjmF2oB 370w, /sites/default/files/styles/news_banner_740/public/aUToronto2022_crop.jpg?h=afdc3185&amp;itok=lX7IIV1K 740w, /sites/default/files/styles/news_banner_1110/public/aUToronto2022_crop.jpg?h=afdc3185&amp;itok=p6ubdIaM 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/aUToronto2022_crop.jpg?h=afdc3185&amp;itok=HFjmF2oB" alt="AutoDrive Challenge team"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-06-14T15:49:13-04:00" title="Tuesday, June 14, 2022 - 15:49" class="datetime">Tue, 06/14/2022 - 15:49</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">The aUToronto team, made up mostly of ¯r¶¹Íø undergraduate students, won the first phase of the AutoDrive Challenge II, which took place earlier this month in Ann Arbor, Mich. (photo courtesy aUToronto)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/authors-reporters/tyler-irving" hreflang="en">Tyler Irving</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/graduate-students" hreflang="en">Graduate Students</a></div> <div class="field__item"><a href="/news/tags/self-driving-cars" hreflang="en">Self-Driving Cars</a></div> <div class="field__item"><a href="/news/tags/undergraduate-students" hreflang="en">Undergraduate Students</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p><a href="https://www.autodrive.utoronto.ca/">A self-driving vehicle team</a> from the ¯r¶¹Íø’s Faculty of Applied Science &amp; Engineering has taken the top spot overall in the first competition of the four-year AutoDrive Challenge&nbsp;II.</p> <p>The achievement by aUToronto <a href="https://news.engineering.utoronto.ca/autodrive-challenge-u-of-t-engineering-places-first-for-the-fourth-straight-year/">continues an impressive winning streak for the team</a>, which&nbsp;consistently placed first throughout the original four-year AutoDrive Challenge.</p> <p>In the current contest, the intercollegiate competition’s original concept has been expanded with more teams and more sophisticated tasks as participants&nbsp;develop and demonstrate an autonomous vehicle (AV) that can navigate urban driving courses.</p> <p>“This year was a fresh new start for us,†says&nbsp;<strong>Frank (Chude) Qian</strong>, a master’s candidate at the ¯r¶¹Íø Institute for Aerospace Studies (UTIAS) and&nbsp;team principal for aUToronto. “We have a very young team, with nearly 90 per cent&nbsp;of the students new to the competition.â€</p> <p>Approximately 85 per cent&nbsp;of these students are undergraduates from across ¯r¶¹Íø Engineering’s departments and divisions. The remainder are graduate students or undergraduates from other parts of ¯r¶¹Íø, including the department of computer science in the Faculty of Arts &amp; Science.</p> <p><img alt src="/sites/default/files/PerceptionCart-crop.jpg" style="width: 750px; height: 500px;"></p> <p><em>Throughout the fall of 2021 and winter of 2022, the aUToronto team spent hours designing, training and testing their perception cart, pictured here at the ¯r¶¹Íø Institute for Aerospace Studies. (photo courtesy of&nbsp;aUToronto)</em></p> <p>A total of 10 institutions from across North America sent teams to AutoDrive Challenge&nbsp;II. They assembled earlier this month at Mcity in Ann Arbor, Mich., a unique purpose-built proving ground for testing the performance and safety of connected and automated vehicles.</p> <p>“This is the first year of the second round of the AutoDrive competition, so the team was required to design, build and code everything from scratch,†says&nbsp;<strong>Steven Waslander</strong>,&nbsp;an associate professor at UTIAS who advises the team along with fellow faculty members <strong>Tim Barfoot</strong>,&nbsp;<strong>Jonathan Kelly</strong>&nbsp;and&nbsp;<strong>Angela Schoellig</strong>.</p> <p>“It was a monumental effort and one that shows the true depth of talent and dedication of all the members of this amazing group of students.â€</p> <p>In the first phase of the four-year competition, the teams were using what are known as perception carts.</p> <p>“We use these to validate the design of our perception system, which we will incorporate onto a real vehicle for next year’s competition,†says Qian. “Our brand-new sensor suite is based on a new solid-state LiDAR modality.â€</p> <p>LiDAR is a sensing technology that works in a similar way to radar, except that it uses laser light instead of sound. It is a key component of the sensor suite – which also includes traditional radar and visual cameras similar to those found in smartphones – that enables a self-driving vehicle to build up a 3D representation of its surroundings.</p> <p><img alt src="/sites/default/files/aUTorontoinAnnArbor-crop.jpg" style="width: 750px; height: 500px;"></p> <p><em>Members of the aUToronto team with their winning perception cart at MCity in Ann Arbor, Mich&nbsp;(photo courtesy of&nbsp;aUToronto)</em></p> <p>In addition to being declared the overall winner, the ¯r¶¹Íø Engineering team received top marks in a wide range of categories, including the concept design event, the traffic light challenge and mobility innovation.</p> <p>One of the most dramatic parts of the competition was the dynamic object detection challenge, during which the cart had to detect and avoid a mannequin of a deer.</p> <p>“In the final testing session, the team realized their segmentation code was failing due to a change to the deer mannequin being used,†says Waslander.</p> <p>“Having planned for just such an event, they immediately switched to high gear. They took and labelled over 2,000 images of the new deer, then spent the whole night training and tweaking a brand-new detector. It worked brilliantly in competition the next day, securing first place.â€</p> <p>Qian says he is very proud of all that the team has accomplished.</p> <p>“This year with so many newer members, training became very important,†he says. “We also lost some precious development time due to challenges associated with COVID-19. I cannot give enough credit to the aUToronto flight team members who really took one for the team and pulled through together.â€</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Tue, 14 Jun 2022 19:49:13 +0000 Christopher.Sorensen 175243 at Researchers design 'socially aware' robots that can anticipate – and safely avoid – people on the move /news/researchers-design-socially-aware-robots-can-anticipate-and-safely-avoid-people-move <span class="field field--name-title field--type-string field--label-hidden">Researchers design 'socially aware' robots that can anticipate – and safely avoid – people on the move</span> <div class="field field--name-field-featured-picture field--type-image field--label-hidden field__item"> <img loading="eager" srcset="/sites/default/files/styles/news_banner_370/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=92aueC8y 370w, /sites/default/files/styles/news_banner_740/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=POt2dsrM 740w, /sites/default/files/styles/news_banner_1110/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=weHgrGz7 1110w" sizes="(min-width:1200px) 1110px, (max-width: 1199px) 80vw, (max-width: 767px) 90vw, (max-width: 575px) 95vw" width="740" height="494" src="/sites/default/files/styles/news_banner_370/public/Hugues-Thomas-robotics-story-weblead.jpg?h=afdc3185&amp;itok=92aueC8y" alt="&quot;&quot;"> </div> <span class="field field--name-uid field--type-entity-reference field--label-hidden"><span>Christopher.Sorensen</span></span> <span class="field field--name-created field--type-created field--label-hidden"><time datetime="2022-05-17T12:54:39-04:00" title="Tuesday, May 17, 2022 - 12:54" class="datetime">Tue, 05/17/2022 - 12:54</time> </span> <div class="clearfix text-formatted field field--name-field-cutline-long field--type-text-long field--label-above"> <div class="field__label">Cutline</div> <div class="field__item">Hugues Thomas and his collaborators at the ¯r¶¹Íø Institute for Aerospace Studies created a new method for robot navigation based on self-supervised deep learning (photo by Safa Jinje)</div> </div> <div class="field field--name-field-author-reporters field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/taxonomy/term/6738" hreflang="en">Safa Jinje</a></div> </div> <div class="field field--name-field-topic field--type-entity-reference field--label-above"> <div class="field__label">Topic</div> <div class="field__item"><a href="/news/topics/our-community" hreflang="en">Our Community</a></div> </div> <div class="field field--name-field-story-tags field--type-entity-reference field--label-hidden field__items"> <div class="field__item"><a href="/news/tags/artificial-intelligence" hreflang="en">Artificial Intelligence</a></div> <div class="field__item"><a href="/news/tags/faculty-applied-science-engineering" hreflang="en">Faculty of Applied Science &amp; Engineering</a></div> <div class="field__item"><a href="/news/tags/machine-learning" hreflang="en">machine learning</a></div> <div class="field__item"><a href="/news/tags/research-innovation" hreflang="en">Research &amp; Innovation</a></div> <div class="field__item"><a href="/news/tags/robotics" hreflang="en">Robotics</a></div> <div class="field__item"><a href="/news/tags/utias" hreflang="en">UTIAS</a></div> </div> <div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>A team of researchers led by ¯r¶¹Íø Professor&nbsp;<strong>Tim Barfoot&nbsp;</strong>is using a&nbsp;new strategy that allows robots to&nbsp;avoid colliding&nbsp;with people by predicting the future locations of dynamic obstacles in their path.&nbsp;</p> <p>The project, which is supported by&nbsp;Apple Machine Learning, will be presented at the International Conference on Robotics and Automation in Philadelphia at the end of May.</p> <p>The results from a simulation, which are not yet peer-reviewed,&nbsp;<a href="https://arxiv.org/abs/2108.10585">are available on the arXiv preprint service</a>.&nbsp;</p> <p>“The principle of our work is to have a robot predict what people are going to do in the immediate future,†says&nbsp;<strong>Hugues Thomas</strong>, a post-doctoral researcher in Barfoot’s lab at the ¯r¶¹Íø&nbsp;Institute for Aerospace Studies in Faculty of Applied Science &amp; Engineering. “This allows the robot to anticipate the movement of people it encounters rather than react once confronted with those obstacles.â€&nbsp;</p> <p>To decide where to move, the robot makes use of Spatiotemporal Occupancy Grid Maps (SOGM). These are 3D grid maps maintained in the robot’s processor, with each 2D grid cell containing predicted information about the activity in that space at a specific time.&nbsp;The robot choses its future actions by processing these maps through existing trajectory-planning algorithms.&nbsp;&nbsp;</p> <p>Another key tool used by the team is light detection and ranging (lidar), a remote sensing technology similar to radar&nbsp;except that it uses light instead of sound. Each ping&nbsp;of the lidar creates a point stored in the robot’s memory.&nbsp;Previous work by the team has focused on labeling these points based on their dynamic properties. This helps the robot recognize different types of objects within its surroundings.&nbsp;</p> <p>The team’s SOGM network is currently able to recognize four lidar point categories:&nbsp;the ground; permanent fixtures, such as walls; things that are moveable but motionless, such as chairs and tables; and dynamic obstacles, such as people. No human labelling of the data is needed.&nbsp;&nbsp;</p> <p>“With this work, we hope to enable robots to navigate through crowded indoor spaces in a more socially aware manner,†says Barfoot. “By predicting where people and other objects will go, we can plan paths that anticipate what dynamic elements will do.â€&nbsp;&nbsp;</p> <p>In the paper, the team reports successful results from the algorithm carried out in simulation. The next challenge is to show similar performance&nbsp;in real-world settings, where&nbsp;human actions can be difficult to predict. As part of this effort, the team has tested their design on the first floor of ¯r¶¹Íø’s Myhal Centre for Engineering Innovation &amp; Entrepreneurship, where the robot was able to move past busy students.&nbsp;&nbsp;</p> <p>“When we do experiment in simulation, we have agents that are encoded to a certain behaviour&nbsp;and they will go to a certain point by following the best trajectory to get there,†says Thomas. “But that’s not what people do in real life.â€&nbsp;</p> <p>&nbsp;</p> <div class="media_embed" height="422px" width="750px"><iframe allow="autoplay" height="422px" src="https://drive.google.com/file/d/1wbq3lVdHZbU_4WSIz7-ArQN-g9fah-gL/preview" width="750px"></iframe></div> <p>&nbsp;</p> <p>When people move through spaces, they may hurry or stop abruptly to talk to someone else or turn in a completely different direction. To deal with this kind of behaviour,&nbsp;the network employs a machine learning technique known as self-supervised learning.&nbsp;&nbsp;</p> <p>Self-supervised learning contrasts with other machine-learning techniques, such as reinforced learning, where the algorithm learns to perform a task by maximizing a notion of reward in a trial-and-error manner. While this approach works well for some tasks – for example, a computer learning to play a game&nbsp;such as chess or Go – it is not ideal for this type of navigation.&nbsp;</p> <p>“With reinforcement learning, you create a black box that makes it difficult to understand the connection between the input – what the robot sees – and the output, or the robot does,†says Thomas. “It would also require the robot to fail many times before it learns the right calls, and we didn’t want our robot to learn by crashing into people.â€&nbsp;&nbsp;&nbsp;</p> <p>By contrast, self-supervised learning is simple and comprehensible, meaning that it’s easier to see how the robot is making its decisions. This approach is also point-centric rather than object-centric, which means the network has a closer interpretation of the raw sensor data, allowing for multimodal predictions.&nbsp;&nbsp;</p> <p>“Many traditional methods detect people as individual objects and create trajectories for them.&nbsp;But since our model is point-centric, our algorithm does not quantify people as individual objects, but recognizes areas where people should be. And if you have a larger group of people, the area gets bigger,†says Thomas.&nbsp;&nbsp;&nbsp;</p> <p>“This research offers a promising direction that&nbsp;could have positive implications in areas such as autonomous driving and robot delivery, where an environment is not entirely predictable.â€&nbsp;&nbsp;</p> <p>In the future, the team wants to see if they can scale up their network to learn more subtle cues from dynamic elements in a scene.&nbsp;</p> <p>“This will take a lot more training data,†says Barfoot. “But it should be possible because we’ve set ourselves up to generate the data in more automatic way: where the robot can gather more data itself while navigating, train better predictive models when not in operation&nbsp;and then use these the next time it navigates a space.â€&nbsp;&nbsp;</p> </div> <div class="field field--name-field-news-home-page-banner field--type-boolean field--label-above"> <div class="field__label">News home page banner</div> <div class="field__item">Off</div> </div> Tue, 17 May 2022 16:54:39 +0000 Christopher.Sorensen 174762 at