Video analysis of zebrafish behavior

One major form of data for our lab is the recorded video of our zebrafish behavioral experiments. Each recording must be quantified objectively so that we can compare and understand subtle changes in swimming behavior following drug administration or environmental manipulation. While manual scoring by human visual assessment is sometimes necessary (e.g., for aggression or seizure assays), our main goal is to automate this analysis as much as possible to decrease human-induced bias and increase the amount of useful data derived from each experiment. These instructions describe our current methods for understanding zebrafish behavior by generating the coordinates of each fish’s swimming trajectory. Several software packages are needed to edit the video files and generate these coordinates, and several more are needed to analyze, summarize, and compare the coordinates to derive behavioral insights.

Acquiring video

Environmental variables

Recording high-quality video without disrupting the zebrafish’s natural behavior is a balance between providing enough light for a clean image and leaving the fish undisturbed. We use enhanced lighting, including a soft CFL backlight, to increase the contrast between the fish and the white background in our videos. We hope to upgrade to infrared light sources in the near future, as this would allow us to record behavior in complete darkness.

Equipment variables

We record videos using a battery-powered GoPro in medium aperture mode at 30 frames per second. The camera is mounted on a tripod roughly 30 cm from the behavioral tanks and at the same vertical height, giving a lateral (side-on) view of four tanks at once. The raw video is stored on a microSD card until it is transferred to a desktop computer for further processing.

Modifying raw videos

Since the main unit of analysis is the individual behavioral tank and each raw video records four behavioral tanks simultaneously, we need to split the raw video into four separate quadrants. In our laboratory, each of the four split videos represents one replicate of one dose of one treatment condition in one behavioral assay. We hope to automate this process in the future, possibly through Avidemux scripts (one possible alternative is sketched after the list below), but currently it is manual work through GUIs with mouse clicks:

  1. Using Avidemux 2.6.12, open the raw video.
  2. Mute the audio track (Audio > Select Track > uncheck any boxes).
  3. Change Video Output to “Mpeg4 AVC (x264)”.
  4. Click Filters and select:
    • Transform > Crop - Adjust the four values to crop the video around one of the behavioral tanks. Crop tightly, eliminating all of the background, including the tank walls and the reflections of the fish on the surface of the water. The final video should show only the water-filled area with the swimming fish.
    • Transform > swsResize - Resize the video by changing the width value to “532”.
    • Colors > GreyScale - Transform to greyscale as the color information is not important for generating trajectories.
  5. Slice the video into an X-minute segment using the A-B function. Use the arrow keys to navigate the video and place point A a few seconds after the fish are poured into the behavioral tanks; these few seconds allow the fish to regain their orientation and normal swimming patterns before the segment begins. Add X minutes to point A and place point B at that resulting time. Avidemux will now save the A-B segment instead of the whole video.
  6. Save video as “/data/XX/zfYYY-ZZZ.avi”, where XX is the behavioral tank number and YYY-ZZZ represent the individual 3-digit ZF numbers of the fish in that tank.
  7. Repeat steps 4-6 for the three other behavioral tanks in the raw video. Each tank will have different values for the crop outlines. Each tank on the right side of the raw video will also have to be horizontally flipped to match the left side (Filters > Transform > Horizontal Flip).
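
One possible route to automating these steps is to script the same trim/crop/resize/greyscale pipeline with ffmpeg instead of Avidemux. The sketch below is a starting point rather than our current workflow: the crop rectangles, segment times, and input file name are hypothetical placeholders that would need to be measured for each camera setup, and the output paths keep the zfYYY-ZZZ placeholders from the naming convention above.

    # Minimal sketch of splitting one raw video into per-tank clips with ffmpeg
    # (not our current Avidemux workflow). Crop rectangles, segment times, and
    # file names are placeholders -- measure them for your own camera setup.
    import subprocess

    RAW_VIDEO = "raw_gopro.mp4"      # hypothetical input file
    SEGMENT_START = "00:00:05"       # point A: a few seconds after pouring
    SEGMENT_LENGTH = "00:10:00"      # X-minute segment (here, 10 minutes)

    # (width, height, x, y, flip?) for each quadrant -- placeholder values
    TANKS = {
        "01": (500, 400, 40, 60, False),
        "02": (500, 400, 40, 520, False),
        "03": (500, 400, 620, 60, True),   # right-side tanks get flipped
        "04": (500, 400, 620, 520, True),
    }

    for tank, (w, h, x, y, flip) in TANKS.items():
        filters = [f"crop={w}:{h}:{x}:{y}", "scale=532:-2", "format=gray"]
        if flip:
            filters.insert(1, "hflip")   # match the left-side orientation
        subprocess.run([
            "ffmpeg", "-y",
            "-ss", SEGMENT_START, "-t", SEGMENT_LENGTH,   # A-B slice
            "-i", RAW_VIDEO,
            "-an",                        # drop the audio track
            "-vf", ",".join(filters),     # crop, (flip,) resize to width 532, greyscale
            "-c:v", "libx264",            # Mpeg4 AVC (x264) output
            f"/data/{tank}/zfYYY-ZZZ.avi",  # lab naming convention
        ], check=True)

If the camera and tanks stay in fixed positions between assays, the crop values should only need to be calibrated once per recording setup.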

Generating trajectories using idTracker

These modified video files can then be analyzed by idTracker, a video-tracking program that detects the zebrafish in each frame and tracks the trajectories of their movements. The quality of the generated swimming trajectories depends heavily on the video parameters and recording quality described above. To analyze a video in idTracker:

  1. Load the video file into idTracker.
  2. Adjust the following variables for optimal fish detection; the goal is to highlight the fish in green without highlighting the background or tank walls. Check several random frames to confirm proper detection throughout the video. This may need troubleshooting, since lighting conditions differ slightly across tanks, positions, and assays, which is why consistent lighting and video conditions are helpful from the start.
    • Adjust the number of fish in the tank (typically 3)
    • Adjust the intensity threshold (typically 0.50-0.80)
    • Adjust the minimum size of each fish (typically 100-250)
  3. Click Start and idTracker will analyze the video. This may take 15+ minutes per video, depending on video length, number of fish, and computer specifications, and it may slow down other tasks on the computer, as idTracker analyses tend to be resource-heavy. When finished, idTracker will display the generated trajectories in a 3D graph and estimate the reliability of the identifications.
  4. Click “See results” and “_nogaps” to load the corrected trajectories into idPlayer for review. Record the “quality” percentages and click Run to begin reviewing the tracked video.
  5. idPlayer will display the video overlaid with the tracking results: fish are numbered and have short trails following them as they swim. Watch the tracked video and note any major errors, swaps, or skips in tracking.
  6. If tracking did not complete, or resulted in poor tracking or major errors, repeat steps 1-3 and adjust the parameters accordingly (some videos may need 5-10 reruns to optimize tracking fidelity).
  7. This produces a file that contains the X-Y coordinates for all fish in the tank for all frames of the video, along with identification probability estimates.
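
For downstream work outside MATLAB, the “_nogaps” output can be read directly into Python. The sketch below assumes the .mat file stores a variable named trajectories with shape (frames, fish, 2) and that untracked frames are NaN; the path is a placeholder, and the actual variable names should be confirmed with scipy.io.whosmat before relying on them.

    # Minimal sketch of loading idTracker's corrected trajectories in Python.
    # Assumes the _nogaps .mat file holds a variable named "trajectories" with
    # shape (frames, fish, 2) and NaN for untracked frames; confirm the names
    # for your idTracker version with whosmat() before relying on this.
    import numpy as np
    from scipy.io import loadmat, whosmat

    path = "/data/01/trajectories_nogaps.mat"   # hypothetical output location

    print(whosmat(path))                        # list the variables actually present
    traj = np.asarray(loadmat(path)["trajectories"], dtype=float)

    n_frames, n_fish, _ = traj.shape
    print(f"{n_fish} fish tracked over {n_frames} frames")

    # Count untracked (NaN) frames per fish as a quick quality check alongside
    # the percentages reported by idPlayer.
    missing = np.isnan(traj[:, :, 0]).sum(axis=0)
    for fish, n_missing in enumerate(missing, start=1):
        print(f"fish {fish}: {n_missing} untracked frames "
              f"({100 * n_missing / n_frames:.1f}%)")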

Modifications necessary to track pentylenetetrazole (PTZ)-treated fish

In our anticonvulsant assays, fish with chemically induced seizures often lose their normal body posture and swim patterns. They may turn over on their backs, presenting a “flipped” fingerprint to idTracker. They may stop swimming entirely and lie motionless on the bottom of the tank, where idTracker interprets them as part of the background. Treated fish may also regain swimming motion after one of these erratic episodes, essentially “reappearing” in the video. Furthermore, the frequency and timing of these events differ for each fish and cannot be predicted beforehand. These videos cause idTracker to make tracking errors because the software cannot vary the number of expected fish over the length of the video. Therefore, when a fish displays a PTZ-related seizure, the following modifications are necessary to obtain high-quality tracking data.

  1. As described above, generate trajectories using idTracker for the PTZ-treated tank. Review the tracked video in idPlayer.
  2. If there is a significant PTZ-related event, note the number of frames of good tracking prior to the event. Continue watching the video and note any further stretches in which idTracker tracked all fish in the tank correctly for a significant time (>60 frames).
  3. Re-analyze each of these well-tracked clips separately by restricting idTracker to that range of frames and specifying the total number of expected fish (N).
  4. For the incorrectly tracked periods following a PTZ-related event, re-analyze that range of frames and specify the new number of expected fish (typically N-1) in the clip.
  5. Review each new trajectory for correct tracking of all fish that are actively swimming in the clip.
  6. Combine the generated trajectory files for each clip into the total video trajectory.
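
Stitching the per-clip trajectories back into one full-length trajectory is mostly bookkeeping: each clip covers a known frame range and a known subset of fish, and everything outside those ranges stays untracked. The sketch below assumes each clip’s trajectory has already been exported as a NumPy array; the frame ranges, file names, and fish assignments are hypothetical, and matching the fish in each clip back to their original identities still has to be done by eye in idPlayer.

    # Minimal sketch of stitching per-clip trajectories into one full-video
    # array. Frame ranges, fish indices, and file names are hypothetical.
    import numpy as np

    N_FISH = 3          # fish originally poured into the tank
    N_FRAMES = 18_000   # total frames in the trimmed video (placeholder)

    # Start with everything untracked (NaN), then fill in each good clip.
    full = np.full((N_FRAMES, N_FISH, 2), np.nan)

    # (start_frame, end_frame, clip_trajectory, fish indices present in clip);
    # each clip_trajectory has shape (end - start, len(fish_ids), 2)
    clips = [
        (0, 6_000, np.load("clip1.npy"), [0, 1, 2]),     # all fish swimming
        (7_500, 18_000, np.load("clip2.npy"), [0, 2]),   # one fish motionless (N-1)
    ]

    for start, end, clip_traj, fish_ids in clips:
        assert clip_traj.shape == (end - start, len(fish_ids), 2)
        for col, fish in enumerate(fish_ids):
            full[start:end, fish, :] = clip_traj[:, col, :]

    np.save("zfYYY-ZZZ_combined.npy", full)   # hand off to trajectory analysis

Leaving the untracked stretches as NaN rather than interpolating keeps the seizure-related immobility explicit for the downstream trajectory analysis.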

Next step: Trajectory analysis using R