# RecSync Android

## Main ideas:

- Triggering at some time in the future is no longer needed
- New types of RPCs are used: start and stop video

## Our contribution:

- Integrated **synchronized video recording**
- Scripts for extraction, alignment and processing of video frames
- Experiment with flash blinking to evaluate video frame synchronization accuracy
- Panoramic video demo with automated Hugin stitching

## Panoramic demo

- We provide scripts to **stitch two synchronized smartphone videos** with the Hugin panorama CLI tools
- Usage:
  - Run ```./make_demo.sh {VIDEO_LEFT} {VIDEO_RIGHT}```

## This work is based on "Wireless Software Synchronization of Multiple Distributed Cameras"

Reference code for the paper
[Wireless Software Synchronization of Multiple Distributed Cameras](https://arxiv.org/abs/1812.09366).
_Sameer Ansari, Neal Wadhwa, Rahul Garg, Jiawen Chen_, ICCP 2019.

If you use this code, please cite our paper:

```
@article{AnsariSoftwareSyncICCP2019,
  author = {Ansari, Sameer and Wadhwa, Neal and Garg, Rahul and Chen, Jiawen},
  title = {Wireless Software Synchronization of Multiple Distributed Cameras},
  journal = {ICCP},
  year = {2019},
}
```

_This is not an officially supported Google product._

_Five smartphones synchronously capture a balloon filled with red water being popped to within 250 μs timing accuracy._

## Android App to Capture Synchronized Images

The app has been tested on the Google Pixel 2, 3, and 4.
It may work on other Android phones with minor changes.

Note: On Pixel 1 devices the viewfinder frame rate drops after a couple of
captures, which will likely make time synchronization much less accurate. This
may be due to thermal throttling. Disabling saving to JPEG or lowering the
frame rate may help.

### Installation instructions:

1. Download [Android Studio](https://developer.android.com/studio). When you
   install it, make sure to also install the Android SDK API 27.
2. Click "Open an existing Android Studio project". Select the "CaptureSync"
   directory.
3. A pop-up titled "Gradle Sync" will complain about a missing file called
   gradle-wrapper.properties. Click OK to recreate the Gradle wrapper.
4. Plug in your Pixel smartphone. You will need to enable USB debugging. See
   https://developer.android.com/studio/debug/dev-options for further
   instructions.
5. Go to the "Run" menu at the top and click "Run 'app'" to compile and install
   the app.

Note: By default, the app will likely start in client mode, with no UI options.

#### Setting up the Leader device

1. On the system pulldown menu of the leader device, disable WiFi.
2. [Start a hotspot](https://support.google.com/android/answer/9059108).
3. After this, opening the app on the leader device should show UI options, as
   well as which clients are connected.

#### Setting up the Client device(s)

1. Enable WiFi and connect to the leader's hotspot.
2. As client devices on the network start up, they will sync up with the
   leader, which will show up on both the leader and client UIs.
3. (Optional) Go to WiFi preferences and disable "Turn on Wi-Fi automatically"
   and "Connect to open networks"; this keeps devices from automatically
   disconnecting from a hotspot without internet.

#### Capturing images

1. (Optional) Press the phase align button to have each device synchronize its
   phase; the phase error is shown in real time.
2. (Optional) Move the exposure and sensitivity sliders on the leader device to
   manually set 2A values.
3. Press the `Capture Still` button to request a synchronized image slightly in
   the future on all devices.

This will save to internal storage, as well as show up under the Pictures
directory in the photo gallery.

Note: Fine-tuning the phase configuration JSON parameters in the `raw` resources
directory lets you trade alignment time for phase alignment accuracy.

Note: AWB is used for simplicity, but it could also be synchronized across
devices.

### Information about saved data

Synchronized images are saved to the external files directory for this app,
which is:

```
/storage/emulated/0/Android/data/com.googleresearch.capturesync/files
```

A JPEG version of the image will also appear in the photo gallery under the
`Pictures` subdirectory under `Settings -> Device Folders`.

Pull data from individual phones using:

```
adb pull /storage/emulated/0/Android/data/com.googleresearch.capturesync/files /tmp/outputdir
```

The images are also stored as a raw YUV file (in
[packed NV21 format](https://wiki.videolan.org/YUV)) and a metadata file, which
can be converted to PNG or JPG using the Python script in the `scripts/`
directory.
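
The supported conversion path is `scripts/yuv2rgb.py`, but for readers who want
to decode the buffer themselves, below is a minimal Java sketch of the packed
NV21 layout. The width/height arguments stand in for values read from the
metadata file, and the BT.601-style coefficients are the usual approximation,
not anything copied from the script:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

/** Minimal NV21 -> ARGB decoding sketch. Illustrative only; use scripts/yuv2rgb.py
 *  for real conversions. NV21 is a full-resolution Y plane followed by a
 *  half-resolution, interleaved V/U plane (V first). */
public final class Nv21ToRgb {

  static int[] toArgb(byte[] nv21, int width, int height) {
    int[] argb = new int[width * height];
    int frameSize = width * height;
    for (int y = 0; y < height; y++) {
      for (int x = 0; x < width; x++) {
        int luma = nv21[y * width + x] & 0xFF;
        int uvIndex = frameSize + (y / 2) * width + (x / 2) * 2;
        int v = (nv21[uvIndex] & 0xFF) - 128;
        int u = (nv21[uvIndex + 1] & 0xFF) - 128;
        // Approximate BT.601 YUV -> RGB conversion.
        int r = clamp((int) (luma + 1.402 * v));
        int g = clamp((int) (luma - 0.344 * u - 0.714 * v));
        int b = clamp((int) (luma + 1.772 * u));
        argb[y * width + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
      }
    }
    return argb;
  }

  private static int clamp(int value) {
    return Math.max(0, Math.min(255, value));
  }

  public static void main(String[] args) throws IOException {
    byte[] nv21 = Files.readAllBytes(Paths.get(args[0])); // img_<timestamp>.nv21
    int width = Integer.parseInt(args[1]);                // taken from the metadata file
    int height = Integer.parseInt(args[2]);
    System.out.println("decoded " + toArgb(nv21, width, height).length + " pixels");
  }
}
```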

#### Example Workflow

1. The user sets up all devices on the hotspot WiFi network of the leader device.
2. The user starts the app on all devices, adjusts the exposure sliders, and
   presses the `Phase Align` button on the leader device.
3. The user presses the capture button on the leader device to collect captures.
4. If JPEG is enabled (the default), the user can verify captures by going to the
   `Pictures` photo directory on the phone through Google Photos or similar.
5. After a capture session, the user pulls the data from each phone to the local
   machine using `adb pull`.
6. (Optional) The Python script is used to convert the raw images:

```
python3 yuv2rgb.py img_<timestamp>.nv21 nv21_metadata_<timestamp>.txt out.<png|jpg>
```

## How Networking and Communications work

Note: Algorithm specifics can be found in our paper linked at the top.

The leader and clients use heartbeats to connect with one another and keep
track of state. Simple NTP is used for clock synchronization. Together with
phase alignment and 2A, this makes the phones capture the same type of image at
the same time. Capturing is done by sending a trigger time to all devices,
which then independently capture at that time.

All of this requires communication. One component of this library provides a
method for sending messages (RPCs) between the leader device and client
devices, enabling synchronization as well as capture triggering, AWB, state
updates, etc.

The network uses WiFi with UDP messages for communication. The leader IP is
determined automatically by client devices.

A message is sent as an RPC byte sequence consisting of an integer method ID
(defined in
[`SyncConstants.java`](app/src/main/java/com/googleresearch/capturesync/softwaresync/SyncConstants.java))
and the string message payload (see `sendRpc()` in
[`SoftwareSyncBase.java`](app/src/main/java/com/googleresearch/capturesync/softwaresync/SoftwareSyncBase.java)).
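
As a rough illustration of that wire format, the sketch below packs and parses
an "integer method ID + UTF-8 payload" datagram. The exact byte layout,
constants, and socket handling live in `SoftwareSyncBase.sendRpc()` and may
differ from this:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

/** Illustrative "int method ID + UTF-8 string payload" message, mirroring the
 *  RPC description above. Not the authoritative SoftwareSync wire format. */
public final class RpcMessage {

  static byte[] pack(int methodId, String payload) {
    byte[] body = payload.getBytes(StandardCharsets.UTF_8);
    return ByteBuffer.allocate(Integer.BYTES + body.length)
        .putInt(methodId)
        .put(body)
        .array();
  }

  static int methodId(byte[] datagram) {
    return ByteBuffer.wrap(datagram).getInt();
  }

  static String payload(byte[] datagram) {
    return new String(
        datagram, Integer.BYTES, datagram.length - Integer.BYTES, StandardCharsets.UTF_8);
  }

  public static void main(String[] args) {
    // A real sender would wrap the bytes in a DatagramPacket and send them over UDP.
    byte[] wire = pack(42, "1234567890"); // hypothetical method ID and payload
    System.out.println(methodId(wire) + " -> " + payload(wire));
  }
}
```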

Note: This app has the leader set up a hotspot, through which client devices
can automatically determine the leader IP address from the connection; however,
one could manually configure IP addresses with a different network
configuration, such as using a router that all the phones connect to.

### Capture

The leader sends a `METHOD_SET_TRIGGER_TIME` RPC (method ID located in
[`SoftwareSyncController.java`](app/src/main/java/com/googleresearch/capturesync/SoftwareSyncController.java))
to all the clients containing a requested capture timestamp in the synchronized
clock domain, far enough in the future to account for potential network latency
between devices. In practice network latency between devices is ~100 ms or
less, but it may vary depending on the devices and network configuration used.

Note: In this app the trigger is set 500 ms in the future, giving plenty of
time for network latency.

Each client and the leader receive the future timestamp, and
`CameraController.java` checks the timestamp of each frame as it comes in,
pulls the closest frame at or past the desired timestamp, and saves it to disk.
One advantage of this method is that, if any delays happen in capturing, the
synchronized capture timestamps reveal the time offset between images without
requiring inspection of the images themselves.
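
A stripped-down sketch of that selection logic is shown below. The real
behaviour lives in `CameraController.java`; the class, method names, and the
sign convention for applying the clock offset here are illustrative
assumptions:

```java
/** Illustrative trigger-time frame selection: save the first frame whose
 *  timestamp, mapped into the leader clock domain, is at or past the trigger. */
public final class TriggerCapture {

  private long triggerTimestampNs = Long.MAX_VALUE; // set via METHOD_SET_TRIGGER_TIME
  private long leaderMinusLocalOffsetNs = 0;        // from the SNTP handshake (assumed sign)

  void onSetTriggerTime(long leaderDomainTimestampNs) {
    triggerTimestampNs = leaderDomainTimestampNs;
  }

  /** Called for every incoming frame with its local sensor timestamp. */
  void onFrame(long localSensorTimestampNs) {
    long syncedTimestampNs = localSensorTimestampNs + leaderMinusLocalOffsetNs;
    if (syncedTimestampNs >= triggerTimestampNs) {
      // The synced timestamp is saved with the image, so the residual offset
      // between devices can be read from metadata instead of the pixels.
      System.out.println("saving frame at synced ts " + syncedTimestampNs);
      triggerTimestampNs = Long.MAX_VALUE; // one frame per trigger
    }
  }

  public static void main(String[] args) {
    TriggerCapture capture = new TriggerCapture();
    capture.onSetTriggerTime(2_000_000_000L);
    capture.onFrame(1_970_000_000L); // too early, skipped
    capture.onFrame(2_003_000_000L); // first frame at/past the trigger, saved
  }
}
```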

Note: Zero-shutter-lag capture is possible if each device is capable of storing
frames in a ring buffer. Then, when a desired current or past capture timestamp
is provided, each device can look in the ring buffer for the closest frame
timestamp and save that frame.
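
The app does not implement that mode, but a hedged sketch of the ring-buffer
lookup it would need looks roughly like this:

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Illustrative zero-shutter-lag lookup (not part of the app): buffer recent
 *  frame timestamps and pick the one closest to a requested past timestamp. */
public final class ZslBuffer {
  private static final int CAPACITY = 30; // roughly one second of frames at 30 fps
  private final Deque<Long> timestampsNs = new ArrayDeque<>();

  void onFrame(long timestampNs) {
    if (timestampsNs.size() == CAPACITY) {
      timestampsNs.removeFirst(); // drop the oldest buffered frame
    }
    timestampsNs.addLast(timestampNs);
  }

  long closestTo(long desiredNs) {
    Long best = null;
    for (long ts : timestampsNs) {
      if (best == null || Math.abs(ts - desiredNs) < Math.abs(best - desiredNs)) {
        best = ts;
      }
    }
    if (best == null) {
      throw new IllegalStateException("no frames buffered yet");
    }
    return best;
  }

  public static void main(String[] args) {
    ZslBuffer buffer = new ZslBuffer();
    for (long ts = 0; ts < 10; ts++) {
      buffer.onFrame(ts * 33_333_333L);
    }
    System.out.println(buffer.closestTo(100_000_000L)); // nearest buffered timestamp
  }
}
```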

### Heartbeat

The leader listens for heartbeats from clients to determine whether a client
exists and whether synchronization with that client needs to be started. When
it gets a heartbeat from a client that is not synchronized, it initiates an NTP
handshake with the client to determine the clock offset between the two
devices.

A client continuously sends a `METHOD_HEARTBEAT` RPC to the leader with a
boolean indicating whether it is already synchronized with the leader.

The leader receives `METHOD_HEARTBEAT` and responds with a
`METHOD_HEARTBEAT_ACK` to the client. The leader uses this to keep track of a
list of clients using a `ClientInfo` object for each client, which also
includes sync information.

The client waits for a `METHOD_OFFSET_UPDATE` from the leader, which contains
the time offset needed to move into a clock domain synchronized with the
leader; after that, its heartbeat messages show that it is synced to the
leader.

Whenever a client becomes desynchronized, its heartbeats notify the leader and
the two re-initiate synchronization. Through this mechanism, automated clock
synchronization and maintenance is achieved.
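
A condensed sketch of the leader-side bookkeeping described above is given
below. The method IDs are the documented ones, but the class, fields, and
return-value convention are illustrative, not the actual `ClientInfo` handling:

```java
import java.util.HashMap;
import java.util.Map;

/** Illustrative heartbeat bookkeeping on the leader. The real implementation in
 *  the softwaresync package differs in structure and detail. */
public final class HeartbeatTracker {

  /** Minimal stand-in for the per-client ClientInfo record. */
  static final class ClientState {
    boolean synced;
    long lastHeartbeatNs;
  }

  private final Map<String, ClientState> clients = new HashMap<>();

  /** Handles METHOD_HEARTBEAT; returns true if an SNTP handshake should be started. */
  boolean onHeartbeat(String clientAddress, boolean clientSaysSynced, long nowNs) {
    ClientState state = clients.computeIfAbsent(clientAddress, address -> new ClientState());
    state.lastHeartbeatNs = nowNs;
    state.synced = clientSaysSynced;
    // A METHOD_HEARTBEAT_ACK would be sent back to the client here.
    return !clientSaysSynced; // unsynced client -> initiate the NTP handshake
  }

  public static void main(String[] args) {
    HeartbeatTracker tracker = new HeartbeatTracker();
    System.out.println(tracker.onHeartbeat("192.168.43.2", false, 1L)); // true: start handshake
    System.out.println(tracker.onHeartbeat("192.168.43.2", true, 2L));  // false: already synced
  }
}
```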

### Simple NTP Handshake

[`SimpleNetworkTimeProtocol.java`](app/src/main/java/com/googleresearch/capturesync/softwaresync/SimpleNetworkTimeProtocol.java)
is used to perform an NTP handshake between the leader and a client. The
devices' local time domains are used, with
[`Ticker.java`](app/src/main/java/com/googleresearch/capturesync/softwaresync/Ticker.java)
providing local nanosecond time.

An NTP handshake consists of the leader sending a message containing the
current leader timestamp t0. The client receives it and appends its local
receive timestamp t1, as well as the timestamp t2 at which it sends a return
message back to the leader. The leader receives this at timestamp t3 and uses
these four times to estimate the clock offset between the two devices,
accounting for network latency.
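
To make the arithmetic concrete, here is the standard four-timestamp offset
estimate as a small sketch. The class and method names are illustrative, not
the actual `SimpleNetworkTimeProtocol` API:

```java
/** Classic SNTP math from the four handshake timestamps. All values are
 *  nanoseconds on the respective local clocks; names are illustrative. */
public final class SntpMath {

  /** Estimated client-minus-leader clock offset. */
  static long estimateOffsetNs(long t0, long t1, long t2, long t3) {
    // t0: leader send, t1: client receive, t2: client send, t3: leader receive.
    return ((t1 - t0) + (t2 - t3)) / 2;
  }

  /** Hard upper bound on the offset error: round-trip time minus client processing time. */
  static long errorBoundNs(long t0, long t1, long t2, long t3) {
    return (t3 - t0) - (t2 - t1);
  }

  public static void main(String[] args) {
    // Toy numbers: client clock ~5 ms ahead, ~2 ms one-way latency, 1 ms processing.
    long t0 = 1_000_000L;
    long t1 = t0 + 5_000_000L + 2_000_000L;
    long t2 = t1 + 1_000_000L;
    long t3 = t0 + 2_000_000L + 1_000_000L + 2_000_000L;
    System.out.println("offset ns: " + estimateOffsetNs(t0, t1, t2, t3));  // 5000000
    System.out.println("error bound ns: " + errorBoundNs(t0, t1, t2, t3)); // 4000000
  }
}
```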

This result is encapsulated in
[`SntpOffsetResponse.java`](app/src/main/java/com/googleresearch/capturesync/softwaresync/SntpOffsetResponse.java),
which also contains the hard upper bound on the timing error of the offset. In
practice the timing error is an order of magnitude smaller, since WiFi network
communication is mostly symmetric, with the bias accounted for by choosing the
smallest sample(s).

More information can be found in our paper on this topic.

### Phase Alignment

The leader sends out a `METHOD_DO_PHASE_ALIGN` RPC (method ID located in
`SoftwareSyncController.java`) to all the clients whenever the Align button is
pressed. Each client on receipt then starts a phase alignment process (handled
by `PhaseAlignController.java`) which may take a couple of frames to settle.
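
For intuition, a toy sketch of the phase error a device tries to drive to zero
is given below. The real `PhaseAlignController.java` also has to inject
frame-duration changes to actually shift the phase, which is omitted here, and
the goal phase and frame period values are made up:

```java
/** Toy phase-error computation: a frame's phase is its timestamp modulo the
 *  frame period, compared against a goal phase. Illustrative values only. */
public final class PhaseMath {
  static final long FRAME_PERIOD_NS = 33_333_333L; // ~30 fps
  static final long GOAL_PHASE_NS = 10_000_000L;   // hypothetical goal phase

  /** Signed error in (-period/2, period/2] between the frame's phase and the goal. */
  static long phaseErrorNs(long frameTimestampNs) {
    long phase = Math.floorMod(frameTimestampNs, FRAME_PERIOD_NS);
    long error = phase - GOAL_PHASE_NS;
    if (error > FRAME_PERIOD_NS / 2) {
      error -= FRAME_PERIOD_NS;
    } else if (error <= -FRAME_PERIOD_NS / 2) {
      error += FRAME_PERIOD_NS;
    }
    return error;
  }

  public static void main(String[] args) {
    // A frame arriving 12 ms into its period is 2 ms past the goal phase.
    System.out.println(phaseErrorNs(3 * FRAME_PERIOD_NS + 12_000_000L)); // 2000000
  }
}
```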

Note: The leader could instead send its current phase to all devices, and the
devices could align to that, reducing the total potential error. For simplicity
this app uses a hard-coded goal phase.

### Exposure / White Balance / Focus

For simplicity, this app uses manual exposure, hard-coded white balance, and
auto-focus. The leader uses UI sliders to set exposure and sensitivity, which
automatically sends out a `METHOD_SET_2A` RPC (method ID located in
[`SoftwareSyncController.java`](app/src/main/java/com/googleresearch/capturesync/SoftwareSyncController.java))
to all the clients, which update their 2A as well. Technically 2A is a misnomer
here, as only exposure and sensitivity are set, not white balance.
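
As a small illustration, a client receiving such an RPC only has to parse two
numbers and apply them as manual capture settings. The payload format below is
an assumption, not the app's actual encoding:

```java
/** Illustrative decoding of a hypothetical "exposureNs,sensitivityIso" 2A payload. */
public final class TwoAPayload {
  public static void main(String[] args) {
    String payload = "8333333,400"; // hypothetical: 1/120 s exposure, ISO 400
    String[] parts = payload.split(",");
    long exposureNs = Long.parseLong(parts[0]);
    int sensitivityIso = Integer.parseInt(parts[1]);
    // On an Android client these would be applied as manual Camera2 settings,
    // e.g. CaptureRequest.SENSOR_EXPOSURE_TIME and CaptureRequest.SENSOR_SENSITIVITY
    // with auto-exposure disabled.
    System.out.println("exposure ns = " + exposureNs + ", ISO = " + sensitivityIso);
  }
}
```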

It is possible to use auto exposure/sensitivity and white balance, and have the
leader lock and send the current 2A using the same RPC mechanism to the other
devices, which can then set theirs manually to the same values.

Note: One could try synchronizing focus values as well, though in practice we
found the values were not accurate enough to provide sharp focus across
devices. Hence we keep auto-focus.