Tuesday, June 30, 2009

Another great deal! LabVIEW student training special offer

Passing along a message I recently received. If any students want to take this chance to learn LabVIEW, it's a great opportunity; NI courses normally start at NT$10,000 and up, so make the most of it!

The message reads:

Dear All

Even though it's the height of summer vacation for students, our two-day paid student courses (NTD 6,000) are still running in July and August!
2009/7/23 (Thursday) ~ 7/24 (Friday)
2009/8/25 (Tuesday) ~ 8/26 (Wednesday)

Two-day paid course + student-edition LabVIEW software + graphical programming textbook = NTD 6,000

http://digital.ni.com/worldwide/taiwan.nsf/web/all/70BA49C9F31106DE482574D600200929

Registration hotline: Ms. Chang (張小姐) at NI, 02-23772222 x7188

2009 FIRST Events in Israel


The FIRST FRC robotics competition, all the rage in the US, is now spreading to other countries. Below is a report on the FIRST robotics events held in Israel. Maybe someday we'll see a similar CompactRIO robot competition in Taiwan!


The 2009 FIRST events in Israel were held in the main basketball arena in Tel Aviv for both the FIRST LEGO League (FLL) and the FIRST Robotics Competition (FRC).
The event was very successful and involved kids from every region of Israel, from cities as well as small towns and villages where technology also ranks high on wish lists. It was rewarding to see kids who are new immigrants integrating with those born in Israel, different spoken languages mixing, and more.
Building on our past experience with FIRST and the big move to cRIO as the main platform, NI Israel dedicated many resources to this event. Our team, led by Eran Castiel and the Applications Engineers group, made a big push and put in extra hours to ensure the success of these kids, many facing high-tech challenges for the first time in their lives, and the success of the event itself. The team's skill and commitment were appreciated by the organizers and by the individual teams (and parents too).
During the past year, NI Israel has developed a close relationship with the local branch of FIRST in Israel, based on monthly meetings and activity planning. Part of the activity was a collaboration between PTC and NI on training and support. NI Israel has also mentored three teams and helped local schools with limited resources by hosting hands-on activities.
NI Israel also trained more than 200 people, including students, teachers, engineers, and academics all the way up to professors. The training agenda was based on LabVIEW materials and CompactRIO concepts.

In conjunction with the FIRST competition, NI Israel was able to build a close relationship with the robotics center in the north of Israel, located at the country's top technological institute.

FIRST Robotics Competition Participation

The competition began in 2005 with 12 teams and has grown to 47 teams in 2009, involving 1,500 high school students and 45 corporate sponsors. Of the 47 teams at the FRC, 90 percent used LabVIEW as their programming language. Israeli President Shimon Peres made an official visit for the opening of the event and also visited individual teams in both FRC and FLL.

Mr. Peres shared his thoughts and vision on the importance of high-tech education.

NI Israel had the honor of awarding medals at the Israeli FLL finals for the best team in control, automation, and inspiration.

What's next? Worldwide 2009 – 2010

FLL:

FLL Israel 2010 Goals

100 Teams

1,000 middle-school-age students

Four Regional Competitions

One Championship Competition

FRC:

2010 FIRST Robotics Competition Participation Goals

Worldwide: 1,800 Teams - 38,000 High School Students

75,000 Volunteers

Israel: 50 Teams - 1,700 High School Students

200+ Volunteers

Wednesday, June 24, 2009

Great success: the CompactRIO autonomous vehicle demo!

The end of the semester is finally here. After a semester of hard work, last night was our final demo, and I'd like to share it with everyone. Even though the students only had about a month and a half to actually work on the project, their results were truly impressive … the right people met the right tools.

Need a refresher on the project rules? See here:

http://riobotics.blogspot.com/2009/06/unmanned-guided-vehicle.html

First, each team gave a presentation on their approach to the problem and the challenges they faced:

士強 1

(Way to go, 士強!)

After the presentations, it was time to take the field! The teams scrambled to make their final parameter adjustments.

Practicing 2 Practicing 1

Here is the most successful team's result: a perfect run!

To thank the students for their hard work, NI also sent an engineer to present certificates of honor together with the instructor.

certificate

CompactRIO, yeah!

group 1

The demo was over, but nobody could quite part with the CompactRIO robot just yet, so one more photo!

Robot 1

The post-course survey revealed several common themes in the students' feedback:

1. The course and project gave everyone a better grasp of robotic system integration. Many had written vision algorithms or tinkered with motors before, but few had ever experienced how an entire system can be integrated end to end.

2. LabVIEW is an easy language to pick up, but as programs grow more complex, graphical debugging and tracing become more taxing.

3. The hardware peripherals (motors, sensors, camera) need more complete integration; students hoped for a full set of equipment along with setup documentation.

I think the next time this course runs, we should spend more time on intermediate LabVIEW concepts so students can maintain and debug their programs efficiently; otherwise, diving headfirst into LabVIEW, it's easy to get tangled in the spiderweb of wires you've drawn yourself. We will also provide a more complete robot kit so students can get up to speed on the platform quickly.

I would like to thank all of the students for their hard work and motivation. Without your passion for robotics, this class would not have been successful. Keep up the good work!

Monday, June 22, 2009

Industry news: Robotics technology program builds and promotes the service robot industry


(Published) 2008/09/02

Recognizing the enormous potential market for robots, the Department of Industrial Technology of the Ministry of Economic Affairs set out to build a development environment for the intelligent-robot industry and to accelerate overall positioning for the coming robot era; the 2005 Executive Yuan SRB meeting concluded by listing intelligent robots as a major policy development program.

The Department of Industrial Technology has supported service-robot R&D at ITRI's Mechanical and Systems Research Laboratories for nearly three years, working with ITRI's information and communications, energy and environment, and creativity units, in hopes of helping domestic companies create service robots that lead a new way of life and enter the home-companionship, security-care, and entertainment service markets. To reduce duplicated investment in early-stage robot R&D, a common robot hardware/software platform is being developed, with security and life-companion robots as the application vehicles, to give industry and academia a foundation for developing all kinds of application robots and to accelerate the build-up of the robot industry.

With the Department's continued support, ITRI's Mechanical and Systems Research Laboratories have focused service-robot development on two main themes: security and companionship. The security/tour-guide robot has already been exhibited and run demonstrations at the National Science and Technology Museum in Kaohsiung, and through technology and patent licensing ITRI is working with Taiwan's leading security firms, 中興保全 and 台灣新光保全, to develop distributed security-robot technology, starting from research on key core modules. Integrated with multi-sensor environmental monitoring, this builds a coordinated multi-robot, multi-tasking security service system, the nation's first "ubiquitous robot security system"; combined with the world's first automatic battery-swapping system, the robots can work 24 hours a day. The flagship companion application, the instant-messaging toy robot IM-Bot, is the first to combine IM technologies used by over a billion people (such as MSN and Skype) with intelligent interactive robotics: lively robot expressions and gestures replace flat, monotonous text symbols and physically act out incoming messages, fully conveying the sender's emotions. It won a silver medal in the Entertainment category and a special product award at the 2008 invention show in Pittsburgh, and is hoped to become a trend-setting networked robot worldwide.

Governments everywhere are actively developing their robot industries: the South Korean government has announced that every household will have a robot by 2013, and the Japan Robot Association forecasts that by 2025 the global robot industry will exceed US$50 billion in annual output. Taiwan's government has approved a 2009 Executive Yuan priority program, a three-year R&D plan for highly responsive robots, integrating ITRI, the Precision Machinery Research and Development Center, the Chung-Shan Institute of Science and Technology, and other institutes, and combining cross-disciplinary technologies to develop service robots (home, entertainment, and industrial). Key modules in positioning, servo control, audio/video, and sensing have been targeted to establish Taiwan's own capability in robot products and modules. The government also plans to invest NT$2 billion over five years, through industrial technology programs and leading-product development programs, to help industry develop service-robot products, aiming for Taiwan to become a major global manufacturer of service robots between 2014 and 2020, with an annual output value of NT$250 billion. This is expected to draw domestic companies into the next wave of high-tech industry, create new investment opportunities, drive the robot industry forward, and make Taiwan a global center for robotics R&D.

Original link

A coffee-making robot

I don't even drink coffee, but this robot suddenly makes me want a cup …

Friday, June 19, 2009

Taiwanese robot artists

Photo: A technician touches up the lifelike-faced intelligent robot at the fair yesterday

At the 7th China·Straits Fair of Projects and Achievements, which opened today, two "artists" from Taiwan drew crowds with their outstanding talents: Jenet and Bica, the entertainment robots exhibited by National Taiwan University of Science and Technology.

A lifelike face, artistic talent, and dance moves

"Hello everyone, my name is Jenet, and I come from 'National' Taiwan University of Science and Technology. Now let me sing you a song, …" Jenet greeted the onlookers warmly, and from the expressions on her face as she spoke, you could hardly tell she was a robot. According to the project leader, Professor 林其禹 of the university's Department of Mechanical Engineering, Jenet is Taiwan's first large bipedal humanoid robot, and the world's first large bipedal humanoid with a lifelike face built for entertainment. Her facial skin, modeled on the functions and features of a real human face and made of silicone, can mimic the facial movements of human speech, and the cameras in her eyes let her vividly imitate human expressions such as surprise, anger, and happiness for anyone standing in front of her. Jenet is also a multi-talented artist with artificial intelligence: she can read a musical score and immediately sing the song, and she can perform pre-recorded dances.

Unlike the lively Jenet, Bica is a taciturn painter. When a visitor stands in front of him, the camera in his eye automatically photographs the face, and after computer analysis he can draw a facial sketch in just a few minutes. 林 told reporters that the two entertainment robots took the university three years to develop and are the "stars" of its robot theater; an earlier "public performance" in Taiwan was widely praised.

For 20,000 yuan, take one home to play chess with you

林 said this was his second year at the fair. Last year he brought a multi-function home education and entertainment robot that can recognize patterns, hold simple conversations, and play chess and cards with its owner; through the fair, they have since signed with Fujian-based 冠翔電子 to commercialize that robot, which will go on sale in mainland China in the second half of this year at an initial price of about RMB 20,000. This time he brought Jenet and Bica in hopes of finding mainland companies for joint development and production, bringing robot theater into the lives of mainland audiences as well.

[Reported from Fuzhou, June 18]

Original link:

http://www.takungpao.com/news/09/06/19/TM-1099891.htm

Thursday, June 18, 2009

Controlling a wheelchair with your mind

After watching movies like "X-Men", everyone starts fantasizing about having superpowers. No need to fantasize anymore; start being "Professor X" yourself...

1. Get an NI CompactRIO.
2. Write a LabVIEW FPGA program.
3. Strap the CompactRIO onto a wheelchair.
4. Sit in the wheelchair and think hard: "Forward, forward, forward, forward!"

OK, it might not be quite that simple, but take a look at these students' work; maybe you could fly a plane with your mind too ...

http://www.theaudeo.com

Original link:
http://viroadshow.blogspot.com/2007/02/wheelchair-controlled-only-by-your.html


Sunday, June 14, 2009

How the HSL color space works


Remember last time when we were testing the AXIS 206 and used a magical vision program that, after a quick camera calibration, could pick out colors? Here is an explanation of how it works: the HSL color space (Hue, Saturation, Luminance).

First, a quick review. We usually describe an image or frame in the RGB color space; TVs, LCD monitors, and projectors all use RGB. If you're not familiar with how RGB works, you can brush up here:

Chinese Wikipedia: 三原色光模式 (RGB color model)

Wikipedia: RGB color model

For basic color-recognition applications, RGB is usually good enough: set the ranges of R, G, and B values you want to capture, and that color region can generally be isolated. In the image below, the RGB histogram (bottom left) shows a tall spike in each of the three channels. That's because the gray filing cabinets take up most of the image, and gray happens to be an equal blend of R, G, and B.

image

Let's isolate those three spikes and take a look. Notice that most of the isolated region is the cabinets?

image

Of course, because of the lighting, some parts of the cabinets were not captured. Color recognition in the RGB color space is easily thrown off by uneven lighting (the poorly lit parts are practically a different shade of gray already …)

Now let's try to capture the green lampshade:

image

Because of the lighting, the right side of the lampshade is actually brighter. We can only compromise and capture the larger, darker green area on the left. If we insist on capturing both the bright and the dark greens, the result might look like this:

image

Sadly, some "green" things in the background got captured as well …

If we could somehow reduce the influence of brightness, the results might improve. This is where HSL comes in.

HSL stands for the Hue, Saturation, Luminance color space, a projection of the standard RGB color space. HSL separates hue from saturation and from brightness, so the problem of luminance variation can be addressed: the luminance component is isolated on its own plane.

Because HSL keeps luminance on a separate plane instead of mixing it into the colors the way RGB does, picking out colors becomes much easier. Most importantly, when the lighting keeps changing (for example, as a robot drives around), tracking is far more stable! See below:

image

All of the above was done with the Color Threshold function in NI Vision Assistant, which supports both the RGB and HSL color spaces directly.
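If you'd like to play with the idea outside Vision Assistant, here is a minimal Python sketch of hue-based thresholding (my own illustration, not NI's implementation; the function name and threshold values are made up, and the standard-library colorsys module does the RGB-to-HSL conversion):

import colorsys

def hue_mask(pixels, hue_lo, hue_hi, sat_min=0.3):
    # pixels: rows of (R, G, B) tuples in 0-255.
    # Keep a pixel only if its hue falls in [hue_lo, hue_hi] on colorsys's
    # 0-1 hue scale and it is saturated enough to have a meaningful hue.
    mask = []
    for row in pixels:
        out = []
        for r, g, b in row:
            h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
            out.append(hue_lo <= h <= hue_hi and s >= sat_min)
        mask.append(out)
    return mask

# green sits roughly around h = 0.33; a gray pixel fails the saturation test
print(hue_mask([[(30, 200, 40), (128, 128, 128)]], 0.25, 0.45))  # [[True, False]]

Because the luminance value l is never tested, a dark green and a bright green both pass, which is exactly the robustness to lighting described above.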

Tuesday, June 9, 2009

How to build a CompactRIO module for a custom I2C sensor (overview)

More and more robots are being developed on CompactRIO these days, and many small, low-power sensors communicate over I2C. Too bad NI doesn't offer a ready-made I2C interface for CompactRIO! Fortunately, I2C is a serial protocol that needs only two signals (clock and data), so in general we can take an off-the-shelf CompactRIO DIO module, define two digital lines as the I2C clock and data signals, and write the I2C encoding and decoding logic in LabVIEW FPGA.
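To make the bit-level idea concrete, here is a small pure-logic Python sketch of the waveform a master would drive on those two lines when writing one byte (an illustration only; the LabVIEW FPGA examples linked below are the real starting point):

# Each tuple is an (SCL, SDA) line state; the slave samples SDA on the
# rising edge of SCL, so SDA must be stable while SCL is high.

def i2c_start():
    # START condition: SDA falls while SCL stays high
    return [(1, 1), (1, 0), (0, 0)]

def i2c_write_byte(byte):
    states = []
    for i in range(7, -1, -1):              # MSB first
        bit = (byte >> i) & 1
        states += [(0, bit), (1, bit), (0, bit)]
    states += [(0, 1), (1, 1), (0, 1)]      # release SDA for the slave's ACK clock
    return states

print(i2c_start() + i2c_write_byte(0xA0))

An FPGA loop would step through a sequence like this at the desired I2C clock rate, reading SDA back during the ACK slot.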

What?! You've never written LabVIEW FPGA? No worries: NI has ready-made I2C examples online that we can drop straight onto the FPGA. See the links below:

I2C Implementation in LabVIEW FPGA

I2C Bus Communication Example Using LabVIEW FPGA

The architecture will end up looking like this:

image

If we want to build the I2C sensor directly into a CompactRIO module, that works too. The advantages are that the module housing protects the sensor, and if someone else needs the same I2C sensor later, the module could perhaps be commercialized, saving them time and trouble.

The clock and data signals can also be picked up directly from the CompactRIO backplane, so the sensor can be laid out on a circuit board and placed inside an empty module shell. The architecture looks like this:

image

As it happens, NI is also running a CompactRIO module design competition right now. If you're interested, sign up soon!

NI Product Design and Entrepreneurship Competition (NI 產品設計創業競賽)

Monday, June 8, 2009

Robot obstacle avoidance - demo video


Sorry, this cell-phone video is pretty rough; I'll find time to shoot a better one, but here's a first look at the URG implementation. The robot is running fully autonomously!

Calculating the angle of the nearest obstacle from the Hokuyo rangefinder

Now that you can obtain the range magnitude and angle arrays from the Hokuyo LabVIEW VI, here’s a little subVI to help identify where the nearest obstacle is.  Keep in mind, 0 degrees is straight ahead, positive angles are towards the left side of the sensor, and negative angles are towards the right side of the sensor.  You can add it to the Hokuyo VI like this:

image LIDAR Max and Min.vi
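For reference, here is the same nearest-obstacle logic as a minimal Python sketch (hypothetical function name; the actual subVI is the attachment above):

def nearest_obstacle(ranges, angles):
    # ranges and angles are index-aligned, as output by the Hokuyo VI;
    # 0 degrees is straight ahead, positive angles toward the left
    valid = [(r, a) for r, a in zip(ranges, angles) if r > 0]  # drop dropouts
    return min(valid) if valid else None

print(nearest_obstacle([1.2, 0.4, 2.0], [-30.0, 0.0, 30.0]))  # (0.4, 0.0)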

Why does a 0% duty cycle still produce a short pulse when generating PWM with the FPGA?


It's LabVIEW FPGA Q&A time again. I used to run into this phenomenon too, because I compiled LabVIEW's PWM example as-is; it turns out the FPGA PWM example NI ships can be improved. This is the example from the Example Finder:

image

When the duty cycle is 0% or 100%, the low-pulse or high-pulse value may well be 0, but feeding "0" to the Loop Timer can't really make it wait zero time: once the instruction has been issued, it takes a few ticks no matter what. If we modify the code so that whenever the low pulse or high pulse equals 0 we simply drive the output to 0 V or 5 V, the phenomenon goes away. Here's an example:

Example of modified FPGA PWM
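In plain terms, the modification adds a branch like the following (a Python sketch of the logic only; the real fix lives in the FPGA VI linked above):

def pwm_period_levels(high_ticks, low_ticks):
    # one PWM period, expressed as the output level for each tick
    if high_ticks == 0:
        return [0] * low_ticks       # hold the line at 0 V: no stray pulse
    if low_ticks == 0:
        return [1] * high_ticks      # hold the line at 5 V
    return [1] * high_ticks + [0] * low_ticks

The 0% and 100% cases never reach the Loop Timer with a zero wait, so the minimum-tick artifact disappears.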

Also, here is a handy VI that computes on the RT side how many high and low ticks you need; just give it the duty cycle and the PWM frequency. Note that this VI assumes the FPGA uses the default 40 MHz timebase; if you use a different timebase, the VI will need some modification. Usage:

image

Calculate PWM High and Low Ticks.vi
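The arithmetic inside that VI boils down to this (a sketch assuming the default 40 MHz timebase, matching the note above):

def pwm_ticks(duty_pct, pwm_hz, timebase_hz=40000000):
    # total ticks in one PWM period at the given timebase
    period = int(round(timebase_hz / pwm_hz))
    high = int(round(period * duty_pct / 100.0))
    return high, period - high

print(pwm_ticks(25, 1000))  # (10000, 30000): 25% duty at 1 kHz and 40 MHz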

Sunday, June 7, 2009

AXIS 206 Camera Test


This weekend I wanted to test just how well this AXIS 206 camera can recognize colors. But what props to test with? Then it hit me: the IKEA lamps at home would be perfect. So on Saturday I dashed off to IKEA, bought a few more lampshades, and then pulled together some LabVIEW example code I had written earlier.

SKIMRA Shade

What it can do now: red, green, and blue are all recognized correctly, provided you first let the program calibrate so it knows the HSL parameter ranges for each color. Don't worry if HSL is unfamiliar; I'll post more about it later. The point is that we can already capture a color and the area it occupies, even the object's outline and center point (the center point will be added to the LabVIEW program later, but it shouldn't be hard).
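For the curious, the center point can come straight out of the thresholded mask as a center of mass. A minimal sketch (my own illustration; NI Vision's particle-analysis functions can also do this for you):

def centroid(mask):
    # mask: binary image as rows of 0/1, e.g. the color-threshold output
    sx = sy = count = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                sx += x
                sy += y
                count += 1
    return (sx / count, sy / count) if count else None

print(centroid([[0, 1, 1],
                [0, 1, 1]]))  # (1.5, 0.5)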

Red:

image

Green:

image

Blue:

image

Download this archive and open "LocateColoredObjectExample_AXIS.vi" to try it out. First select the color to lock onto (red, green, or blue), then click the "Calibrate" button, and you can use the program for object tracking.

LocateColoredObjectExample_AXIS.rar

Friday, June 5, 2009

Who can design a soccer-playing robot in LabVIEW? I can!

Check out this YouTube video. These students are amazing: they used LabVIEW to design a robot that walks and kicks a soccer ball! If you'd like to build a robot with LabVIEW too, you can follow their steps...

1. First, try controlling the robot's motors from LabVIEW over RS-485. The students started with simple open-loop motor control (no feedback), and the result was ... poor Darwin 0. But at least they proved that LabVIEW was feasible.

2. Use the LabVIEW environment to simulate the robot's kinematic equations. Software simulation saved them a lot of time, and before long they had Darwin 1, which could walk.

3. Add the robot's artificial intelligence. Its "brain" is a PC/104 with a 1.4 GHz Pentium M processor (running LabVIEW RT), and its "eyes" are an industrial-grade IEEE 1394 camera. They wrote not only the robot's motion control laws in LabVIEW but also the vision-recognition logic. Put it all together and you get Darwin 2, which plays soccer.


CompactRIO robot mobility platform

This is an earlier collaboration with Professor 蕭耀榮 of the Department of Vehicle Engineering at National Taipei University of Technology. Here CompactRIO integrates wireless networking, PDA remote control, and even a Wii controller to drive the motor platform. Take a look!

Thursday, June 4, 2009

Interfacing to the AXIS 206 Camera (Part III)

Now that we've successfully connected to the AXIS camera from the PC, we can try connecting the AXIS 206 to the second network port of the cRIO-9074 (remember, the second port; the first one connects the CompactRIO to your computer).

First, some configuration. Find your CompactRIO in MAX and select the "Advanced Ethernet Settings" option.

image

On the left, select the MAC address of the second Ethernet port, then change the mode from "Disabled" to "TCP/IP Network". The IP address can be set to the same subnet as your AXIS 206's address. After you click OK, the CompactRIO will need to reboot.

image

Next, we can use the program from Part I to communicate with the AXIS 206 camera from the cRIO! Remember to move the VI from the "My Computer" level of the LabVIEW Project to the cRIO level.

 image

Wednesday, June 3, 2009

Interfacing to the AXIS 206 Camera (Part II)


Heads up: CompactRIO does not support NI Vision functions out of the box, but we can hack the CompactRIO a little to get around this.

There are two install scripts at these locations:

C:\Program Files\National Instruments\RT Images\Vision\8.6.0\install.cdf

C:\Program Files\National Instruments\RT Images\CommonVision\nivissvc.cdf

Replace with these files:

CDF files to replace

Note: These CDF files are for LabVIEW 8.6.0 only.

Once the scripts have been correctly updated, relaunch MAX and download Vision to your cRIO target. Right-click Software and select Add/Remove Software.

image

Select Custom Software Installation.

image

Highlight NI Vision RT and select “Install the feature.” Click Next. 

image

Reboot and you’re done!

Interfacing to the AXIS 206 Camera (Part I)


The AXIS 206 is an Ethernet-based camera that supports up to 640x480 resolution. It has a built-in web server that lets you monitor live images from a web browser: just type http://x.x.x.x, where x.x.x.x is the IP address of your camera.

The camera may ask for a user name and password when you connect to the page; usually the user name is "root" and the password is "pass." Find the option in the settings menu to enable "anonymous login" so that the prompt no longer appears. That way a LabVIEW program can control the camera without being blocked by the password.

The VI you want to run is called “axis_03_grab.vi”.  You’ll need to also have the NI Vision Development Module installed before you run it.

Don't run this VI on the CompactRIO yet, because I haven't told you how to modify the CompactRIO so that it can run the vision algorithms. You can run this VI on the PC first to experiment with it. Remember to change the IP address so that it matches your camera! When connecting from a PC without a hub or router, make sure to use a crossover cable, otherwise you might have some problems.
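If you want to sanity-check the camera connection outside LabVIEW first, a few lines of Python will grab one snapshot over HTTP (this assumes anonymous viewing is enabled as described above; /axis-cgi/jpg/image.cgi is the snapshot CGI on AXIS cameras of this generation, and the IP address here is hypothetical):

import urllib.request

camera_ip = "192.168.1.90"  # use your camera's address
url = "http://%s/axis-cgi/jpg/image.cgi" % camera_ip

with urllib.request.urlopen(url, timeout=5) as resp:
    jpeg = resp.read()

with open("snapshot.jpg", "wb") as f:
    f.write(jpeg)
print("saved", len(jpeg), "bytes")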

image

AXIS 206 Vision VIs download

Also, if you have trouble finding CalculateFPS.vi, here it is for download.

CalculateFPS.vi download

CompactRIO Scan Mode Tutorials

Here are some tutorials from the NI website to show you how to use CompactRIO Scan Mode.  You’ll probably want to use Scan Mode to output an analog voltage from the NI 9263 analog output module to control your motors.

English tutorials

Chinese tutorials

Tuesday, June 2, 2009

Connecting LabVIEW to the Hokuyo URG-04LX laser rangefinder

It's pretty simple; just follow the instructions below:

1. Download the Hokuyo URG-04LX instrument driver from ni.com/idnet (search for Hokuyo).

http://www.ni.com/devzone/idnet/

2. Install under Program Files\National Instruments\LabVIEW X.X\instr.lib

3. Connect the URG's USB port to the PC.

4. Find out from Device Manager which COM port to use to communicate with the URG.

5. Open the LabVIEW example, and select the appropriate COM Port.
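If you'd like to sanity-check the serial link outside LabVIEW, here is a minimal pyserial sketch (the COM port name is hypothetical; use the one from step 4, and note the URG must already be in SCIP 2.0 mode, as discussed below):

import serial  # pyserial

urg = serial.Serial("COM3", baudrate=115200, timeout=1.0)
urg.write(b"VV\n")   # SCIP 2.0 version-information command
print(urg.read(300).decode("ascii", errors="replace"))
urg.close()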


If your URG is new, you may need to use its utility to switch SCIP from 1.1 to 2.0. Here is the configuration tool; you can also use it to set the baud rate of the RS-232 interface (if you connect over USB, the baud rate setting doesn't matter).

URG Configuration Tool

Have fun!

-John

Final project: Unmanned Guided Vehicle

This is the final project I designed for Professor 羅's Robotics Sensing and Control course at National Taiwan University. Interested readers are welcome to give it a try.

921 U8930: Robot Sensing and Control

Final Project: Unmanned Guided Vehicle (UGV)

Objective:

To design a robot that can navigate through an obstacle course and deliver a payload to the designated target location.

Background Information:

Unmanned Guided Vehicles (UGVs) perform routine tasks for industry and operate in areas hazardous to humans. Machine vision can give such vehicles "sight", allowing them to understand their surroundings and enabling more flexible use of UGVs. Other sensors can collect additional data, and data from multiple sensors may be fused to derive a more complete picture of the environment. Such sensors include LIDAR, IMUs (inertial measurement units), ultrasonic sensors, and GPS receivers. Real-world uses of UGVs include military applications, terrain exploration, automated driving or driver assistance, and consumer applications (robot "helpers" or "maids").


image image image

Fig. 1: UGVs in military, automotive, and consumer use

For this project, you will be using the NI CompactRIO system to help you integrate various sensors, vision systems, and motion control. By understanding how NI LabVIEW works as a graphical programming language, you will be able to assemble the critical components of a robotics system.

Rules of the Challenge:

image 

1. Robots start at the "Start Point". There will be 6 red cones scattered randomly around the area. The field is surrounded by a wall or fence. The waypoints will be marked with brightly colored flags or markers.

2. The robot has to navigate to the first waypoint to "pick up" the payload. For the purposes of this challenge, the robot only has to physically touch the waypoint and wait for 5 seconds.

3. The robot has to navigate to the second waypoint to "deliver" the payload. Again, it only has to physically touch the waypoint and wait for 5 seconds.

4. The robot has to navigate back to the "Start Point" and stand by for further instructions.

5. Each group gets 3 runs on the obstacle course. The run with the shortest time counts as your best run, and the team with the shortest course time receives 5 bonus points toward its overall score.

Scoring scheme:

Robot reached Waypoint 1: 10 points

Robot reached Waypoint 2: 10 points

Robot returned to Start Point: 10 points

Bonus Points: 5 points (awarded to the team with the shortest time)

Penalties:

1. If the robot physically touches a cone, 2 points will be deducted.

2. If the robot physically touches the wall or fence, 5 points will be deducted.

3. If the robot needs to be reset, there is no penalty, but the timer does not stop. You must return your robot to the Start Point to try again.

4. Except for returning the robot to the Start Point, you may not interfere with the robot while it is navigating the obstacle course (e.g., nudging, tilting, or pushing it). The robot must finish the course autonomously to receive full points. Interfering with the robot disqualifies the current run, and its course time will not be counted.

Materials/equipment provided to you:

- NI CompactRIO

- NI LabVIEW

- NTU robot platform with 24 V battery

- Hokuyo URG-04LX Scanning Laser Rangefinder

- AXIS 206 camera

- Wireless router

- 24 V DC to 5 V DC converter

Project Guidelines:

There are two main challenges involved in completing the obstacle course.

- How do I make the robot avoid the cones and stay within the field?

- How do I let the robot know where the waypoints are?

Of course, there are many ways to solve these problems. It will be up to you to use your creativity to implement such a solution. Here we will briefly investigate a few examples to help you get started.

Obstacle Avoidance:

Sensors such as ultrasonic and infrared sensors can tell a robot whether an object is within the sensor's range, but their accuracy is limited, and unless they are used in an array configuration you will most likely only extract vague information. A LIDAR sensor such as the Hokuyo URG-04LX Scanning Rangefinder, on the other hand, can feed back an accurate "map" of its surroundings within its scan range. With such a sensor it is easy to detect objects accurately, and the robot can use this information to decide whether to drive forward, turn, reverse, and so on (see the sketch after Fig. 2). LabVIEW has example programs that interface directly with the Hokuyo URG-04LX.

image image

Fig. 2: Hokuyo URG-04LX and resulting "map" of surroundings
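As a sketch of the kind of reactive decision this enables (hypothetical thresholds; a real implementation would live in your LabVIEW control loop):

def avoid(ranges, angles, safe_m=0.5):
    # positive angles are to the left of the sensor, per the Hokuyo convention
    dist, ang = min(zip(ranges, angles))   # nearest reading
    if dist > safe_m:
        return "drive forward"
    return "turn right" if ang >= 0 else "turn left"  # veer away from it

print(avoid([1.2, 0.4, 2.0], [-30.0, 0.0, 30.0]))  # "turn right"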

Waypoint Detection:

Object detection and tracking is usually done with a vision system in robotics applications. Because machine vision depends heavily on ambient lighting conditions, many different algorithms can be applied to object tracking. One method uses color: if an object's color contrasts strongly with its surroundings, a camera can recognize that unique color, and further vision processing can isolate the object's relative position. Robots can also match an object by its geometry, or by a combination of geometry and color. The waypoint markers will be brightly colored objects. LabVIEW also has example programs that interface directly with the AXIS 206 IP camera.

image image

Fig. 3: AXIS 206 camera and a sample object-tracking screenshot

Suggested Project Timeline

Week 1: Establish connections to the CompactRIO and motors.

Week 2: Connect and test the Hokuyo LIDAR sensor.

Week 3: Connect and test the AXIS 206 camera. Test the vision algorithm.

Week 4: Fine-tune the navigation algorithm.

Week 5: Final debugging and field trials.

Reminders:

- Always back up your programs!

- Before you start programming, always draw a flowchart of your logic.

- Be careful of short circuits and live wires. Double-check all wiring before powering on!

- When you're not sure what a VI in LabVIEW does, use the online help and the Example Finder.

- Start early! Leave plenty of buffer time for experimentation.

Be safe, be smart, and have fun.

John Wu

National Instruments Taiwan

wei-han.wu@ni.com

Link to Word document