
Drone Swarms for Sensing-of-Sensing

Steve Mann, Cayden Pierce, Jesse Hernandez, Qiushi Li, Bei Cong Zheng, Yi Xin Xiang

MannLab Canada, 330 Dundas Street West, Toronto, Ontario, M5T 1G5

[Fig. 1 annotations: surveillance camera; transmitter; "rabbit ears" receive antenna; wearable computer and lock-in amplifier; light bulb brightens when in the sightfield.]

Fig. 1. Mann’s metaveillance apparatus from the mid 1970s [15], [16], [17] used one or more light bulbs fed from a wearable computer and lock-in amplifier designed to lock in on television signals from surveillance cameras. The light bulb transitions from a dim red glow to a brilliant white whenever it enters the camera’s field-of-view, and then the bulb brightness drops off again when it exits the camera’s field of view. Waving it back and forth in a dark room reveals to the human eye, as well as to photographic film (picture at left), the camera’s capacity to “see”.

Abstract—The use of drone swarms is proposed for the sensing-of-sensing (metaveillography and metaveillogrammetry), with applications in surveillance audits, security audits, autonomous-vehicle sensory verification, and testing automotive sensors and automotive sensing.

I. BACKGROUND AND RELATED WORK

The sensing-of-sensing (also known as metasensing) is an area of increasing importance because people are now surrounded by a proliferation of sensors [1], [2] that they often know little about regarding their efficacy or agency, e.g. whether the sensors are for surveillance [3], [4], [5], [6], [7], [8], sousveillance [9], [10], [11], [12], [13], [14], or networked machine intelligence, which we call IoT³ (Internet of Things-That-Think).

The Greek word “meta” means “beyond”. For example, metadata is data about data. Likewise, metaveillance is the veillance of veillance (sensing sensors and sensing their capacity to sense). See Fig. 1 for the original invention that led to the study of metaveillance.

Metaveillance was created as both visual art [18], [16], [17] and scientific exploration [15], [19].

II. THE CASE FOR DRONES

Metaveillography is proposed as a scientific measurement to visualize the capacity of sensors to sense. Consequently, high spatial resolution and repeatability are mandatory criteria. There are various methods by which metaveillographs can be created, including wearable SWIMs, 2D plotter SWIMs (Fig. 2), robotic arms, and 3D plotter SWIMs [20]. These methods are effective in many circumstances, but there are situations when they fall short, such as when the sensors are

Fig. 2. Photograph of a smart street light with a metaveillographic mechanism showing the streetlight’s built-in camera and its capacity to sense. The physical phenomenon underlying the picture can be seen clearly in the form of the hysteresis of even and odd arcs. Compare this with Fig. 1.

in difficult-to-access locations. This situation can arise when a sensor is placed high on a ceiling, in an area that pedestrians cannot access, or in high-traffic areas where there is only a very limited amount of time to create a metaveillograph.

Drones provide a significant advantage over other methods because of their ability to fly to high places and to transgress bounds (terrain, hostility, traffic, and fencing) that humans or ground-based robots cannot normally cross [21], [22]. The ability to fly high is important, as most surveillance cameras are mounted in areas in which it would be difficult or impossible to set up a plotter or reach the sensor with a portable SWIM.

Additionally, drones are versatile [23], as meta-sensing can be performed whilst flying over small areas (e.g. in a bathroom stall, for sensing of sensors used in electronic toilet flushing) or very large areas (a foyer or cathedral where surveillance is being analyzed). Drones are also much cheaper than an industrial robotic manipulator. The versatility and much lower cost of drones mean that metaveillographs can now be created quickly, cheaply, and effectively. This extends the possibilities for hobbyists and professionals alike to explore metaveillance almost anywhere.

Finally, in the past, most metaveillographs were created in 2D space. Using drones allows this research to continue into 3D space. This is significant because imaging in 3D space


Fig. 3. Experimental setup utilizing drones, HTC Vive™ VR system, LEDs, and camera.

[Fig. 4 flowchart nodes omitted: the Live Camera Processing thread reads each camera frame, converts it to grayscale, counts the white pixels, and saves the Frame value to a buffer; the Parametric Drone Control and G-Code Drone Control loops reset a Kalman estimator, read the buffered Frame value, set the RGB LED and the drone position, and raise an ExitFlag when finished.]

Fig. 4. Flowcharts for the Live Camera Processing, Parametric Drone Control, and G-Code Drone Control routines.

represents a measure of the true veillance field of a sensor [24]. For those researching metaveillance, it also creates images that provide an intuitive understanding of the veillance field.

III. DRONE-BASED METAVEILLANCE

The goal of drone-based sensing-of-sensing is to investigate metasensing in applications where 3D space is needed to fully understand the sensor, such as the sensory systems of electric, autonomous vehicles [25], [26]. We also wish to investigate sensors that are installed in locations that are difficult to access

Fig. 5. Left: CAD model depicting the setup for metaveillography of a surveillance camera. Right: Computerized 3D model of veillance flux.

Fig. 6. Left: Helical flight path metaveillograph of a dome surveillance camera, placed on a stool for testing purposes. Right: Skeletonized metaveillograph of a surveillance camera, depicting the boundaries of the sensor’s veillance field in 3D space.

[27], [28], [29]; thus, the veillance of ceiling-mounted security cameras is investigated.

To create a metaveillograph of a vehicle, a dark parking lot was set up with all of the technology needed to create a metaveillograph. Fig. 3 depicts this setup.

The Bitcraze Crazyflie™ 2.1 drones were chosen due to their ability to fly in the dark, as most drones use optical-flow techniques that do not work in complete darkness. HTC Vive™ Base Stations [30] are used to pinpoint the location (localization) of the drones in 3D space. The Crazyflie’s™ software and hardware are both open source as well, allowing us to make both hardware and software modifications to the system.

These drones were used to create metaveillographs in 3D space as they followed a flight path around the sensor under test. Mounted on each drone were a bottom ring of RGB LEDs, two forward-facing white headlight LEDs, and a single top-mounted RGB LED. These LEDs were used in various configurations as test lights and as metaveillographic lights. The test light provides a stimulus to be sensed by the sensor-under-test (e.g. a camera), while the metaveillographic light serves to show the metaveillance of the sensor-under-test. The metaveillographic light is RGB, and its colour is a function of the camera’s sensing. When the test light is not sensed by the sensor-under-test, the metaveillographic light is red. When the test light is strongly sensed, the metaveillographic light is blue. The frame captured from the testing camera is processed by first converting it to grayscale, then converting it to black-and-white by a threshold, and counting the number of white pixels. The number of white pixels is then passed into an activation function to determine whether the metaveillographic light should be blue or red (or an RGB gradient between these colours). A long-exposure photo is then taken by an external camera. This captures many values of the metaveillographic light in 3D space, constructing an image that reveals the veillance flux of the sensor that is being sensed.
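
As a concrete illustration, the sketch below reproduces this grayscale–threshold–count pipeline and a simple activation function mapping the white-pixel count to a red-to-blue colour. It is our reconstruction rather than the authors’ exact code: OpenCV is assumed for frame capture, and the threshold and normalization constants are placeholder values.

```python
# A reconstruction (not the authors' exact code) of the frame-processing step:
# grayscale -> threshold -> count white pixels -> activation -> LED colour.
import cv2
import numpy as np

THRESHOLD = 200          # gray level above which a pixel counts as "white" (assumed)
MAX_LIT_PIXELS = 5000.0  # white-pixel count treated as "strongly sensed" (assumed)

def metaveillance_colour(frame, max_lit=MAX_LIT_PIXELS):
    """Map one frame from the sensor-under-test to an (R, G, B) colour.

    Red = test light not sensed; blue = test light strongly sensed;
    intermediate counts give a red-to-blue gradient.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)                  # grayscale
    _, bw = cv2.threshold(gray, THRESHOLD, 255, cv2.THRESH_BINARY)  # black-and-white
    light_pixel_num = cv2.countNonZero(bw)                          # count white pixels
    activation = min(1.0, float(np.sqrt(light_pixel_num / max_lit)))  # soft activation in [0, 1]
    return (int(255 * (1.0 - activation)), 0, int(255 * activation))

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                 # feed from the testing camera (index assumed)
    ok, frame = cap.read()
    if ok:
        print(metaveillance_colour(frame))    # e.g. (255, 0, 0) when nothing is sensed
    cap.release()
```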

The drones’ flight is controlled from the command station server using radio communication (a Bitcraze Crazyradio™ PA radio dongle). Every 0.02 seconds, a new point in 3D space is sent to a drone over the radio link. The drone travels to this location and then waits for the next command.
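
A minimal sketch of this setpoint-streaming loop is shown below. It assumes the Bitcraze cflib Python API; the radio URI is a placeholder, and the parametric constants are those appearing in the Fig. 4 flowchart (a slowly widening helix advancing along the x axis), so the loop is illustrative rather than the authors’ exact control code.

```python
# Stream one position setpoint every 0.02 s to a Crazyflie over the Crazyradio link.
import time
from math import cos, sin

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'   # placeholder Crazyradio address

def fly_parametric_helix(scf):
    cf = scf.cf
    t, x = 0.0, 0.0
    for i in range(4800):                      # iteration count from the Fig. 4 flowchart
        r = 0.15 + (i / 4800) * 0.3            # radius grows from 0.15 m to 0.45 m
        z = r * cos(t) + 0.45
        y = r * sin(t) - 0.13
        cf.commander.send_position_setpoint(x, y, z, 0.0)   # yaw held at 0
        t += 0.063 / 4                         # angular step per command
        x += 0.003 / 16                        # slow advance along the helix axis
        time.sleep(0.02)                       # one new point every 0.02 s
    cf.commander.send_stop_setpoint()

if __name__ == "__main__":
    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        fly_parametric_helix(scf)
```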


Fig. 7. Metaveillograph of an electric, semi-autonomous vehicle.

Three different methods were developed to generate drone flight paths. For some paths, parametric functions were used to create cones, circles, etc. Another method was to first develop the flight paths in CAM (Computer Aided Manufacturing) software. The tool path designed in CAM software was then exported to G-Code, and the Cartesian coordinates were extracted, rotated, and translated in order to fly a path around the sensor. Flowcharts for these two processes are shown in Fig. 4. The final method was to generate the paths dynamically. The drones would sweep a large planar area under the camera, and the command station would mark whenever the drone flew into or out of the field of view of the camera. These markings would represent the boundary points of the camera’s field of view (Fig. 5, right). The boundary points would then be used to compute the drones’ flight path by fitting four straight lines to the data points to form a rectangle and then calculating the four intersection points at the four vertices of the rectangle. The drones would fly only at the boundaries of the camera’s field of view (Fig. 6, right). The entire process would then be repeated at various distances from the camera to achieve a 3D model. This final method was the most versatile because a camera can be re-positioned anywhere within the area being swept, without requiring any changes to the drones’ code. In contrast, moving the camera would require translating or rotating the parametric functions and G-Code accordingly.
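
A hedged reconstruction of the boundary-fitting step of this final method is sketched below: the boundary points recorded at one sweep distance are grouped into four edges, each edge is fit with a least-squares line, and adjacent lines are intersected to recover the corners of the field-of-view rectangle. The grouping heuristic (binning points by angle around the centroid) and the assumption that the rectangle is roughly axis-aligned in the sweep plane are ours, not the authors’.

```python
# Fit four straight lines to field-of-view boundary points and intersect them
# to obtain the four corners of the rectangle at one sweep distance.
import numpy as np

def fit_line(points):
    """Fit a line a*x + b*y = c (unit normal (a, b)) to an (N, 2) point array."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    a, b = vt[-1]                               # direction of least variance = edge normal
    return a, b, a * centroid[0] + b * centroid[1]

def intersect(l1, l2):
    """Intersection of two lines given as (a, b, c) with a*x + b*y = c."""
    A = np.array([l1[:2], l2[:2]])
    c = np.array([l1[2], l2[2]])
    return np.linalg.solve(A, c)

def fov_rectangle(boundary_points):
    """Return the 4 corner points of the field-of-view rectangle at this depth."""
    pts = np.asarray(boundary_points, dtype=float)
    centroid = pts.mean(axis=0)
    angles = np.arctan2(pts[:, 1] - centroid[1], pts[:, 0] - centroid[0])
    # Bin points by angle into four 90-degree sectors, one per edge.
    edge = ((angles + np.pi + np.pi / 4) // (np.pi / 2)).astype(int) % 4
    lines = [fit_line(pts[edge == k]) for k in range(4)]
    # Adjacent sectors share a corner; intersecting their lines gives the vertices.
    return [intersect(lines[k], lines[(k + 1) % 4]) for k in range(4)]
```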

IV. RESULTS AND DISCUSSION

Drones have been instrumental in creating 3D and 2D metaveillographs of autonomous vehicles, as demonstrated in Figs. 7 and 8. As we approach a future of ubiquitous self-driving cars, the efficacy of the sensory systems of autonomous vehicles comes into question [31]. This is why we have developed this method of imaging the sensory capabilities of autonomous vehicles, so that end consumers can know that their cars are safe [32], [33], [34].

Also, drones have been effective in achieving the goal of metaveillance in hard-to-access areas (see Fig. 6). This development may be used to identify surveillance coverage

Fig. 8. Metaveillograph of a backup camera captured by the drones flying arcs in a 2D plane.

leaks, giving insight into the safety level of security and recording systems. A high-demand commercial application of this technology is quantifying the safety level of a location according to its metaveillance flux density coverage. This quantitative measure of veillance flux can then serve as a requirement for insurance premiums.

An important aspect of our results is the fact that these methods yield photos of veillance. Thus, this is a technique that preserves information as photographic evidence. Photographic evidence is necessary and holds great power in a court of law, where such evidence is almost always considered stronger than testimony.

Drones have proven useful as a tool of metaveillance. However, there exist some drawbacks. The drones’ precision outdoors can sometimes be less than optimal, depending on factors such as wind and solar IR radiation. The Crazyflie™ 2.1 drones’ motor mounts and expansion hats are fragile, but this is somewhat offset by the ease of 3D-printing new parts and finding schematics through Bitcraze’s™ website. In the future, we aim to explore the creation of an Augmented or Virtual Reality visualization of any sensor’s veillance flux by applying the techniques we have developed for using drones to create high-resolution computerized 3D models of a sensor’s veillance flux (see Fig. 5).

V. CONCLUSION

A novel metaveillance technique (visualizing the veillance field of sensors) has been implemented using drones. This method provides consumers (car manufacturers, banks, malls, and anyone who relies on sensors) with an intuitive understanding of their veillance efficacy, whether this be for surveillance, sousveillance, or autonomous and electronic systems.

ACKNOWLEDGEMENTS

The authors thank the many advisors, affiliates, and students of MannLab and MannLab Canada who helped with the project, including Phillip, Mike, Humza, and Kyle.


REFERENCES

[1] C. E. Jimenez, F. Falcone, A. Solanas, H. Puyosa, S. Zoughbi, and F. Gonzalez, “Smart government: Opportunities and challenges in smart cities development,” in Civil and Environmental Engineering: Concepts, Methodologies, Tools, and Applications. IGI Global, 2016, pp. 1454–1472.

[2] F. Casino, P. Lopez-Iturri, E. Aguirre, L. Azpilicueta, A. Solanas, and F. Falcone, “Dense wireless sensor network design for the implementation of smart health environments,” in 2015 International Conference on Electromagnetics in Advanced Applications (ICEAA). IEEE, 2015, pp. 752–754.

[3] D. Lyon, Surveillance Studies: An Overview. Polity Press, 2007.

[4] “Online etymology dictionary, Douglas Harper,” 2010. [Online]. Available: http://dictionary.reference.com/browse/surveillance

[5] D. M. Wood, “The ‘surveillance society’: Questions of history, place and culture,” European Journal of Criminology, vol. 6, no. 2, pp. 179–194, 2009.

[6] T. Monahan, Surveillance and Security: Technological Politics and Power in Everyday Life. Routledge, 2006.

[7] C. Norris, M. McCahill, and D. Wood, “The growth of CCTV: a global perspective on the international diffusion of video surveillance in publicly accessible space,” Surveillance & Society, vol. 2, no. 2/3, 2002.

[8] D. Lyon, Surveillance Society. Open University Press, Buckingham, 2001.

[9] J.-G. Ganascia, “The generalized sousveillance society,” Soc. Sci. Info., vol. 49, no. 3, pp. 489–507, 2010. [Online]. Available: http://ssi.sagepub.com/content/49/3/489.abstract

[10] J. Bradwell and K. Michael, “Security workshop brings ‘sousveillance’ under the microscope,” University of Wollongong: Latest News, 2012, http://media.uow.edu.au/news/UOW120478.html, accessed 2014.

[11] V. Bakir, Sousveillance, media and strategic political comm... Continuum, 2010.

[12] K. Michael and M. Michael, “Sousveillance and point of view technologies in law enforcement: An overview,” 2012.

[13] M. G. Gordon Fletcher and M. Kutar, “A day in the digital life: a preliminary sousveillance study,” SSRN, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1923629, September 7, 2011.

[14] S. Mann, J. Nolan, and B. Wellman, “Sousveillance: Inventing and using wearable computing devices for data collection in surveillance environments,” Surveillance & Society, vol. 1, no. 3, pp. 331–355, 2003.

[15] S. Mann, “Phenomenal augmented reality: Advancing technology for the future of humanity,” IEEE Consumer Electronics, pp. cover + 92–97, October 2015.

[16] ——, “Wavelets and chirplets: Time–frequency perspectives, with applications,” in Advances in Machine Vision, Strategies and Applications, World Scientific Series in Computer Science, vol. 32, P. Archibald, Ed. Singapore; New Jersey; London; Hong Kong: World Scientific, 1992.

[17] “Steve Mann,” Campus Canada, ISSN 0823-4531, p. 55 Feb–Mar 1985, pp. 58–59 Apr–May 1986, p. 72 Sep–Oct 1986, 1985.

[18] S. Mann, “Wearable technologies,” Night Gallery, 185 Richmond Street West, Toronto, Ontario, Canada, July 1985. Later exhibited at Hamilton Artists Inc, 1998.

[19] M. Asad, O. A. Aidaros, R. Beg, M. A. Dhahri, S. A. Neyadi, and M. Hussein, “Development of autonomous drone for gas sensing application,” in 2017 International Conference on Electrical and Computing Technologies and Applications (ICECTA), Nov 2017, pp. 1–6.

[20] S. Mann, “Phenomenological augmented reality with the sequential wave imprinting machine (SWIM),” in 2018 IEEE Games, Entertainment, Media Conference (GEM), Aug 2018, pp. 1–9.

[21] S. Chen, Y. Shu, B. Yu, C. Liang, Z. Shi, and J. Chen, “Mobile wireless charging and sensing by drones,” in Proceedings of the 14th Annual International Conference on Mobile Systems, Applications, and Services Companion. ACM, 2016, pp. 99–99.

[22] F. Flammini, C. Pragliola, and G. Smarra, “Railway infrastructure monitoring by drones,” in 2016 International Conference on Electrical Systems for Aircraft, Railway, Ship Propulsion and Road Vehicles & International Transportation Electrification Conference (ESARS-ITEC). IEEE, 2016, pp. 1–6.

[23] I. Deaconu and A. Voinescu, “Mobile gateway for wireless sensor networks utilizing drones,” in 2014 RoEduNet Conference 13th Edition: Networking in Education and Research Joint Event RENAM 8th Conference. IEEE, 2014, pp. 1–5.

[24] R. Janzen and S. Mann, “Sensory flux from the eye: Biological sensing-of-sensing (veillametrics) for 3D augmented-reality environments,” in IEEE GEM 2015, 2015, pp. 1–9.

[25] W. J. Fleming, “Overview of automotive sensors,” IEEE Sensors Journal, vol. 1, no. 4, pp. 296–308, Dec 2001.

[26] J. Kocic, N. Jovicic, and V. Drndarevic, “Sensors and sensor fusion in autonomous vehicles,” in 2018 26th Telecommunications Forum (TELFOR), Nov 2018, pp. 420–425.

[27] P. D. Mitcheson, D. Boyle, G. Kkelis, D. Yates, J. A. Saenz, S. Aldhaher, and E. Yeatman, “Energy-autonomous sensing systems using drones,” in 2017 IEEE SENSORS, Oct 2017, pp. 1–3.

[28] Z. O. Zaw, V. P. Bui, and C. E. Png, “Efficient wireless link modeling for marine drone application under harsh offshore environment,” in 2017 XXXIInd General Assembly and Scientific Symposium of the International Union of Radio Science (URSI GASS), Aug 2017, pp. 1–2.

[29] N. Saadat and M. M. M. Sharif, “Application framework for forest surveillance and data acquisition using unmanned aerial vehicle system,” in 2017 International Conference on Engineering Technology and Technopreneurship (ICE2T), Sep. 2017, pp. 1–6.

[30] P. Dempsey, “The teardown: HTC Vive VR headset,” Engineering & Technology, vol. 11, no. 7-8, pp. 80–81, 2016.

[31] D. F. Wilkins, “Optical communications and obstacle sensing for autonomous vehicles,” Mar. 26, 2015, US Patent App. 14/034,130.

[32] E. Ohn-Bar and M. M. Trivedi, “Looking at humans in the age of self-driving and highly automated vehicles,” IEEE Transactions on Intelligent Vehicles, vol. 1, no. 1, pp. 90–104, March 2016.

[33] A. Skarbek-Zabkin and M. Szczepanek, “Autonomous vehicles and their impact on road infrastructure and user safety,” in 2018 XI International Science-Technical Conference Automotive Safety, April 2018, pp. 1–4.

[34] D. Asljung, J. Nilsson, and J. Fredriksson, “Using extreme value theory for vehicle level safety validation and implications for autonomous vehicles,” IEEE Transactions on Intelligent Vehicles, vol. 2, no. 4, pp. 288–297, Dec 2017.