
20 Jul Data Collection and Data Viz!
Hi,
All of the work we have done to prepare our fog monitoring station would have been in vain if we did not get it running properly.
So, we went out a week later and collected the data from the field. Here is an instructional video of collecting data out in the field. Enjoy
After we collected the data, we wanted to see our results, so we used R to create some tidy data visualizations.
We wrote scripts that would allow us to convert the raw data into usable and processed data. Here is an example of code we wrote that automates the data wrangling process.
library(plyr)   # for ldply(); load before dplyr to avoid masking
library(dplyr)  # for distinct() and the %>% pipe

stations <- list("TP", "SB", "NN", "EB")
for (i in stations) {
  # read in all .dat files and bind them into a single data frame
  df <- list.files(path = paste("./raw_data/", i, "/", sep = ""),
                   pattern = "*.dat",
                   full.names = TRUE) %>%
    ldply(read.csv, skip = 1)
  # convert every column except the first to numeric
  df[-1] <- sapply(df[-1], as.numeric)
  # keep only unique rows
  df <- distinct(df)
  # drop empty rows
  df <- subset(df, RECORD >= 0)
  # save to the station's processed folder
  write.csv(df, file = paste("./processed_data/", i, "/", i, "_processed_CR800.csv", sep = ""))
}
For each station, this pulls the raw .dat files from the raw data folder, processes them, and automatically writes the result as a CSV to the appropriate processed folder.
After the data was processed, we tried our hand at some data viz. A cool trick I learned was to animate the graph using gganimate!
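The gganimate trick boils down to adding a transition to an ordinary ggplot. Here is a minimal sketch using made-up leaf wetness readings (the column names, units, and values are hypothetical, not our station data):

```r
library(ggplot2)
library(gganimate)

# hypothetical leaf wetness readings: one every 15 minutes over a day
df <- data.frame(
  time    = seq(as.POSIXct("2020-06-20 00:00"), by = "15 min", length.out = 96),
  wetness = pmax(0, 50 * sin(seq(0, 2 * pi, length.out = 96)) + rnorm(96, sd = 5))
)

p <- ggplot(df, aes(time, wetness)) +
  geom_line(color = "steelblue") +
  labs(x = "Time", y = "Leaf wetness") +
  transition_reveal(time)  # reveal the line progressively along the time axis

# render the animation; anim_save() can then write it out as a gif
animate(p, nframes = 100, fps = 10)
```

`transition_reveal()` is what makes the line "draw itself" over time; swapping it for other gganimate transitions (e.g. `transition_time()`) changes how frames are generated.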
This plot shows the leaf wetness sensor measurements from June 20th. As you can see, the fog spikes up in the early morning and drops off around 9 am.

Here is another plot that shows the day-to-day fluctuations of fog.
It was really fun writing code and creating data viz plots, and was rewarding to finally see the end product of our project this summer.
Thanks for tuning in.
Taro