# Density stratification in fine-grained rivers

2017 marked my first talk at the fall meeting of the American Geophysical Union. It was pretty exciting and I was extremely nervous. In the end, it went okay, and I was able to present my work to a broad range of scientists. I presented my ongoing research, based on field surveys of the Yellow River, China, during flood.

I hypothesized that the river would exhibit a density stratification in the flow. Density stratification occurs in a river because the entrainment of sediment into the flow affects the bulk properties of the flow. I’ll explain with the help of a few graphics below. The image below is a profile of an open channel flow (the thick black line is the channel bed) and the top of the graphic is the water surface.

a. The velocity profile of a steady and uniform open channel flow is well described by the logarithmic “law of the wall,” or log-law. The log-law takes the form ū(z) = (u*/κ) ln(z/z₀), which predicts the time-averaged velocity (ū) as a function of the shear velocity (u*) and the log of the height above the bed (z); z₀ is a reference height very near the bed, and κ is the von Kármán constant (≈0.41). The equation, evaluated over the flow depth, is shown on the left side of figure a. Because velocities are higher near the surface, the momentum (p = mv) of the flow is higher near the flow surface than near the bed. This condition is unstable, so momentum is redistributed from the surface to the bed through mass transfer. When the flowing mass of high-momentum fluid reaches the bed, it dissipates, forming turbulent eddies that shed off the channel bed and move up into the water column.

b. The turbulent eddies coming off the channel bed entrain sediment into the flow and carry it up from the bed towards the surface. The vertical distribution of sediment through the water column depends on the size of the particles and the entraining velocity, and can be estimated by the Rouse equation: c̄/c̄_b = [((H − z)/z) · (b/(H − b))]^ZR, which predicts the time-averaged concentration (c̄), relative to the time-averaged near-bed concentration (c̄_b) at reference height b, as a function of the height above the bed (z), the flow depth (H), and the Rouse number (ZR = ws/(κu*)), which balances the settling velocity of the particles (ws) against the entraining shear velocity (u*). The Rouse equation and log-law work well only in dilute suspensions, that is, flows in which the concentration of sediment is small enough to have no feedback on the flow.

c. In flows where the sediment concentration near the bed is high enough to feed back on the system, a density stratification develops. In short, the high concentration of sediment prevents momentum redistributed from the surface from fully reaching the bed, which has the net effect of reducing sediment suspension and enhancing flow velocities near the surface.

Because sediment transport (qs is the width-averaged transport rate) is the product of the velocity and concentration profiles integrated over the flow depth, qs = ∫ ū(z) c̄(z) dz from z = b to z = H, this density stratification could significantly alter total sediment transport rates in fine-grained rivers from existing predictions. My ongoing research in this field is trying to resolve precisely what conditions lead to the development of a density stratification in the river.
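To make the dilute-suspension version of this calculation concrete, here is a short Python sketch (for illustration only; this is not my analysis code, and every parameter value is a made-up example) that builds the log-law and Rouse profiles and integrates their product to get qs:

```python
# Sketch of the dilute-suspension transport calculation described above.
# All parameter values are illustrative assumptions, not field data.
import numpy as np

def transport_rate(ustar=0.05, H=2.0, z0=1e-4, b=0.01, cb=0.01,
                   ws=0.005, kappa=0.41):
    """Width-averaged transport qs = integral of ubar(z) * cbar(z) dz, z = b..H."""
    z = np.linspace(b, H, 1000)
    u = (ustar / kappa) * np.log(z / z0)            # log-law velocity profile
    ZR = ws / (kappa * ustar)                       # Rouse number
    c = cb * (((H - z) / z) * (b / (H - b))) ** ZR  # Rouse concentration profile
    return np.trapz(u * c, z)                       # numerical integration
```

Under stratification, the measured profiles depart from these dilute-suspension forms, and that departure is exactly what the field survey is designed to detect.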

This material is based upon work supported by the National Science Foundation Graduate Research Fellowship under Grant No. 1450681. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

# Hurricane Harvey update

The Greater Houston Metro just got pounded by one of the largest storms (in terms of rainfall) on record: Hurricane Harvey. There was widespread flooding across all parts of the metro, but my home and Rice University were largely spared. The skies are clear today, after ~5 days of nearly continuous rainfall. However, the cleanup effort for this disaster will last years and cost billions of dollars. Nonetheless, I am certain that Houston will come back stronger and better than ever. This ordeal has made me even more proud to be a Houstonian.

At my house, we had what began as a small leak in an interior bathroom, but because we received so much rain over the duration of the event, the roof became saturated and the sheetrock partially collapsed. No one was hurt, thankfully, and repairs are already underway.

I am currently working on an NSF RAPID grant, for which I compiled the following figure. The figure shows total rainfall during the Harvey event, with the City of Houston (CoH) labeled. The precipitation data were collected from the National Weather Service and then summed to produce the total rainfall values in the plot below. I am proposing some work on the Brazos River (BR), so this feature is also labeled.

# Outreach module — Flooding risk in low-lying landscapes

I have put together an outreach module that describes some of the risks of flooding in low-lying landscapes. The module runs in Matlab, either within a licensed environment, or with the Matlab Runtime Environment (which is available to anyone).

Accompanying the GUI is a worksheet that steps through all the aspects of the GUI and attempts to demonstrate the principles of flooding in deltas without detailing the math or physics behind the model. My hope is that it will be helpful to high school educators in their coursework.

So far, I have only written a worksheet targeted at 9th–12th graders, but I plan to write two more worksheets in the near future (one for younger students, and one for more advanced undergraduate/graduate students).

Below is a demonstration of the GUI. The full information for the module (including the source code) is on my GitHub here. The project website is here.

This material is based upon work supported by the National Science Foundation (NSF) Graduate Research Fellowship under Grant No. 145068 and NSF EAR-1427177. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

# Accepted article — Continental divide location and mobility

My first peer-reviewed manuscript was recently accepted for publication in Basin Research! The manuscript explores the controls on continental divide location and mobility. In particular, we explore the impacts of exogenic (allogenic) forcings that adjust topographic gradients in the landscape, and ascertain what the relevant length scales for driving continental divide migration may be.

See the full publication here: http://onlinelibrary.wiley.com/doi/10.1111/bre.12256/full

A preview (submission version) can be found here, in accordance with the terms of the license agreement.

Appalachian 150 km filter overlain with contours of calculated total rock deformation (thick gray lines) since 3.5 Ma, resulting from the combined effects of mantle induced dynamic topography and the flexural response of the lithosphere to unloading and loading of sediments across the surface (Moucha & Ruetenik, 2017). Prince et al. (2011) suggest that the Roanoke River (R) will eventually capture the headwaters of the New River (N), causing the actual divide to jump farther west, ultimately approaching or reaching the synthetic divide. This prediction is consistent with patterns of rock deformation (Moucha & Ruetenik, 2017) and calculated χ values (Fig. 9), and crudely co-located with the Central Appalachian Anomaly (CAA) tomographically imaged by Schmandt & Lin (2014). Inset shows the record of sediment flux off the Appalachians into the Atlantic passive margin Baltimore Canyon Trough basin (Pazzaglia & Brandon, 1996). The unsteady flux is characterized by pulses in increased sediment deposition that are interpreted to result from large-scale drainage captures that rapidly incise an enlarged Atlantic slope drainage area. FZ – Fall Zone, scarp – Orangeburg, Chippenham, and Thornburg scarps from Rowley et al. (2013).

——————————————————————————

Exogenic forcing and autogenic processes on continental divide location and mobility

The position and mobility of drainage divides are an expression of exogenic landscape forcing and autogenic channel network processes integrated across a range of scales. At the large scale, represented by major rivers and continental drainage divides, the organization of drainage patterns and divide migration reflects the long-wavelength gradients of the topography, which are exogenically influenced by tectonics, isostasy, and/or dynamic topography. This analysis utilizes long-wavelength topography synthesized by a low-pass filter, which provides a novel framework for predicting the direction of divide movement as well as an estimate of the ultimate divide location, and is complementary to recent studies that have focused on the χ channel metric. The Gibraltar Arc active plate boundary and Appalachian stable plate interior, two tectonically diverse settings with ongoing drainage system reorganization, are chosen to explore the length scales of exogenic forcings that influence continental drainage divide location and migration. The major watersheds draining both the active and decay-phase orogens studied here are organized by topographic gradients that are expressed in long-wavelength low-pass filtered topography (λ ≥ 100 km). In contrast, the river network and divide location are insensitive to topographic gradients measured over filtered wavelengths < 100 km that are set by local crustal structures and rock type. The lag time between exogenic forcing and geomorphic response, along with feedbacks, causes divide migration to be unsteady and to occur through pulses of drainage capture and drainage network reorganization that are recorded in sedimentological, geomorphic, or denudation data.

This material is based upon work supported by the National Science Foundation (NSF) Graduate Research Fellowship under Grant No. 145068 and NSF EAR-1427177. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

# China field work 2017

I’m heading off tomorrow for my third summer field campaign in China. That also means that I’ve completed three years of my PhD.

I always seem to act a bit introspective around this time of the year, reevaluating decisions, remembering achievements, and reliving failures. This year hasn’t been especially easy for me; I’ve lost two important people in my life, and I’ve struggled to get my research moving at a pace that I feel is fast enough. Nonetheless, I’ve done a lot of good things this year too; I will be an author on (at least) two papers coming out this year, I won a service award for my department, and I am starting a new series of symposia at Rice that I’m especially excited about.

Our field campaign marks another journey halfway around the world to collect data on one of the most exceptional rivers in the world. My research goals for this year are to collect water column data during a flood. My analysis of the last two years’ data suggests that at high discharges, the concentrations of sediment become sufficiently high to dampen turbulence in the flow and introduce “density stratification”. I’m hoping to constrain the development of the density stratification in the lower Yellow River during this field campaign.

I’ll be working with some of the best people on this field campaign. It will be led by Brandee Carlson, my advisor Jeff Nittrouer, Hongbo Ma, and me, and we will be helped (immensely) by Tian Dong, Chenliang Wu, Eric Barefoot, Dan Parsons, and Austin Chadwick.

Cheers for another year! Wish us luck!

Brandee Carlson and me towards the end of our 2016 field campaign

# LaTeX CV formatting

I’ve recently spent quite a bit of time trying to get my CV formatted exactly the way I want, using LaTeX. Those who know me know that I love LaTeX, probably to the point of being a snob about it…so naturally my CV has to be made with LaTeX. The final product can be found here.

The bulk of the CV was pretty easy to work up; I did most of it manually with the help of just a few critical packages (e.g., tabularx, enumitem). What was important to me, though, was to develop an easily maintainable method for keeping my CV up to date with publications and presentations. I use Zotero to manage references, so exporting the references to a .bib file to be processed in my LaTeX CV seemed like an obvious place to start.

Below is a style file I have created, providing the package cvlist, which takes the exported .bib file containing all of the papers and conference presentations I have been a part of and formats them into a beautiful LaTeX list. To do so, I’m using BibLaTeX to process the .bib file.

There were a few things that I wanted to have my CV list set up for:

1. multiple sections for different types of “references”, e.g., peer reviewed, other publications, and conference proceedings
2. reverse counting of items within each section, such that the most recent items come first and have the highest number
3. the ability to have publications in various states (e.g., submitted, in prep) and format them properly into the reference
4. provide links (formatted as a clean single word “[link]”) for items which have a URL included in the .bib file

This is mostly achieved within the style file providing the cvlist package. The code comes from various places all over the internet (mostly stackexchange though <3), and I have tried to attribute where possible. The package can be found here.

The only things required in the actual CV .tex file are then to call the package, add the bib resource, and define the section filters. I have chosen to define the section filters in the .tex file itself, but they could be moved into the style file by default too.

```
\usepackage{cvlist}
\addbibresource{../CV_biblist.bib}

% define the filters that will separate out the sections printed
\defbibfilter{peer}{type=article and not subtype=nopeer}
\defbibfilter{nopeer}{not type=inproceedings and subtype=nopeer}
\defbibfilter{conf}{type=inproceedings}
```

Finally, the reference sections are printed with calls:

```
\nocite{*}
\section*{Refereed Publications}
\printbibliography[filter=peer,heading=none]
```

```
\section*{Other Publications}
\printbibliography[filter=nopeer,heading=none,resetnumbers]
```

```
\section*{Scientific Presentations with Abstracts}
\printbibliography[filter=conf,heading=none,resetnumbers]
```
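Putting the pieces together, a minimal CV document using this setup might look like the sketch below (the filename `CV_biblist.bib` and the filters come from the snippets above; the `article` document class and a biber backend are my assumptions, not necessarily what my actual CV uses):

```latex
% Minimal sketch of a CV .tex file using the cvlist package described above.
% Assumes cvlist.sty is on the path and compilation with pdflatex + biber.
\documentclass{article}
\usepackage{cvlist}
\addbibresource{../CV_biblist.bib}

% filters separating the printed sections
\defbibfilter{peer}{type=article and not subtype=nopeer}
\defbibfilter{conf}{type=inproceedings}

\begin{document}
\nocite{*} % include every entry in the .bib file, cited or not

\section*{Refereed Publications}
\printbibliography[filter=peer,heading=none]

\section*{Scientific Presentations with Abstracts}
\printbibliography[filter=conf,heading=none,resetnumbers]
\end{document}
```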

I don’t expect that my solution will work out of the box for anyone trying to do something similar to what I have done, but I do hope it provides a good starting point. The package can be found here.

# The exceptional sediment load of fine-grained dispersal systems: Example of the Yellow River, China — Ma et al., 2017

We’ve just published an exciting new paper in Science Advances which assesses the transport of sediment in fine-grained river systems. The research was driven by Postdoctoral Researcher Hongbo Ma, who is the first author on the publication. I led the field survey and processed the multibeam data of the Yellow River channel bed that you see in a few of the figures in the paper.

Hongbo has identified a physical explanation for why fine-grained rivers are able to move so much sediment. In short, it has to do with the organization of the channel bed: dunes are wiped out at high Froude numbers when the grain size on the bed is small. This reduces the form drag in the river and leaves more skin friction available to bring sediment into suspension. Hongbo continues to make strides in identifying a “phase transition” in sediment-transporting systems that helps to explain the observations made in the Yellow River.
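For context, the Froude number referenced here is Fr = U/√(gH) for open-channel flow; a minimal sketch (the numbers are illustrative only, not values from the paper or the survey):

```python
# Froude number for open-channel flow, Fr = U / sqrt(g * H).
# Values below are illustrative, not from the Yellow River survey.
import math

def froude_number(U, H, g=9.81):
    """Fr < 1 is subcritical flow; dunes tend to wash out as Fr increases."""
    return U / math.sqrt(g * H)

Fr = froude_number(U=2.0, H=1.5)  # a fast, shallow flow
```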

Yellow River at Hukou Waterfall. The river here is a bedrock-alluvial river, but this image provides a good demonstration of the comparatively massive volume of sediment transported by the Yellow River

You can get the paper here, or uploaded to my site as a pdf here.
There is a full article about the research with loads more information, including a video interview, from the Rice press department here. And an article from the National Science Foundation (funding organization) here. A Scientific American video explaining some of the research is available here.

# Vibracore extraction tripod engineering drawings — Vibracore system

For our research in China, I was charged with building a Vibracore system. The Vibracore works by using a concrete vibrator to rapidly vibrate an upright, thin-walled aluminium pipe into the sand/dirt/mud below. A tripod is then set up over the in-ground pipe to pull it out of the ground. The pipe (now called a core, I suppose…) is then cut open with a saw and analyzed/sampled.

dimetric view of assembled tripod

This system is nothing we invented, although I’m not sure of its origin. I based the design for our tripod on an apparatus that our colleague John Anderson has in his collection of field equipment. Our only substantial modification to the design was to make the legs of our system separable, so that instead of solid 10′ pipes of aluminium, we have two 5′ pipes joined by a coupler. This is quite useful for us, since we send our system to China each year, and it makes the system much easier to handle for shipping.

I recently made some engineering drawings of our system for a colleague and figured I would share them here in case they may be helpful to others. You can find the plans as a .pdf file here, or explore the system in three dimensions in the software they were designed in (OnShape CAD) at this link.

example drawing: head assembly top plate

Rice crew Vibracoring the Yellow River delta

In the future, I hope that my colleague Brandee Carlson (who leads the research using the Vibracore) and I can write an updated guide based on our experiences using the system in the field, but for now I’ll leave you with a few references for the system design below.

Thompson, T. A., Miller, C. S., Doss, P. K., Thompson, L. D. P., and Baedke (1991). Land-based Vibracoring and Vibracore analysis: Tips, Tricks, and Traps. Occasional Paper 58.

Gastaldo, R. A. Collection and analysis techniques for paleoecological studies in coastal-deltaic settings.

# Building a simple delta numerical model: Part VI

This will be the final piece of the model that we need in order to have a working code for delta growth: the time routine. We will define a few more terms to set up the model to loop through all the timesteps and update the evolving delta.

```
T = 500;                     % yrs
timestep = 0.1;              % timestep, fraction of a year
t = T / timestep;            % number of timesteps
dtsec = 31557600 * timestep; % seconds in a timestep
```

T is the total time the model will be run for (in years), timestep is the fraction of a year that will be simulated with each timestep (expressed in seconds as dtsec), and t is the number of timesteps to loop through. Now we simply take the block of code that we’ve built up to calculate all the properties of the delta (slope, sediment transport, deposition, etc.) and surround it with a for statement:

```
for i = 1:t
    [S] = get_slope(eta, nx, dx); % bed slope at each node
    [H] = get_backwater_fixed(eta, S, H0, Cf, qw, nx, dx); % flow depth
    U = Qw ./ (H .* B0); % velocity
    [qs] = get_transport(U, Cf, d50, Beta);
    qsu = qs(1); % fixed equilibrium at upstream
    [dqsdx] = get_dqsdx(qs, qsu, nx, dx, au);
    [eta] = update_eta(eta, dqsdx, phi, If, dtsec);
end
```

To explain the above block in words: for every timestep i, we calculate the slope of the bed (eta) everywhere in the model domain, then use this slope (and other defined parameters) to determine the flow depth and velocity through the entire delta. The velocity is used to calculate sediment transport, and the change in sediment transport over space produces a change in the channel bed elevation over time (i.e., the Exner equation). Finally, we return to the top of the loop to calculate a new slope based on the new bed.

That’s it! Our delta model is now complete and we can outfit it with all sorts of bells and whistles to test hypotheses about delta evolution in the natural (or experimental) world. A powerful tool indeed!

Below is a simple movie output from this model that shows the results of our hard work! The complete code for the delta model can be found here.

Note that there is a small instability that grows at the front of the sediment wedge. This isn’t a huge problem depending on what you want to do with your model, but you can tweak dx and dt to make things run more smoothly (see the CFL condition for more information).
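As an aside, the stability criterion mentioned above can be sketched in a few lines (a generic Python illustration of the CFL idea, not part of the MATLAB model; the values of u, dt, and dx are made up):

```python
# Generic sketch of the CFL (Courant) stability criterion.
# The numbers are illustrative, not taken from the delta model above.

def courant_number(u, dt, dx):
    """Courant number C = u * dt / dx; explicit schemes typically require C <= 1."""
    return u * dt / dx

C = courant_number(u=1.0, dt=0.5, dx=1.0)  # C = 0.5
stable = C <= 1.0
```

Halving dt (or doubling dx) halves C, which is why tweaking those two knobs tames the instability.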

This material is based upon work supported by the National Science Foundation (NSF) Graduate Research Fellowship under Grant No. 145068 and NSF EAR-1427177. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

# The NPR budget and federal spending – Opinion piece

I wrote this up in response to something someone shared on Facebook (silly, I know), but I was curious to have some facts on what the numbers are. I’m reproducing some of the content from the shared page for context (and because giving them clicks seems counter to my point):

Source: truthfeed.com

Anyway, below is what I wrote about it. Note that I didn’t actually post this publicly on Facebook, but I did send it to the sharer. Sources are provided where relevant, but nothing was too rigorously vetted. BEWARE: opinions and not science below!!

——————————————————————————————————

NPR received 14% of its 2015 operating budget from government grants at all levels (only 9% of which I can verify came from Federal grants) [Fig. 1, http://www.npr.org/about-npr/178660742/public-radio-finances]. The NPR operating budget in FY 2015 was ~\$198 million [p. 30, http://www.npr.org/about/annualreports/FY15_annualreport.pdf], so let’s say, conservatively, that no more than \$28 million came from the Federal budget. The Federal budget in 2015 was \$3.7 trillion; over half (62%) of this was required spending (social security, healthcare, interest on debt), and the remaining \$1.2 trillion went to discretionary spending [https://www.cbo.gov/sites/default/files/114th-congress-2015-2016/graphic/51112-discretionaryspending.pdf]. This is where NPR’s budget “lives”; note that NPR doesn’t actually get federally earmarked funds, but instead receives Federal funds through grants from the Corporation for Public Broadcasting.

\$28 million out of \$1.2 trillion is about 0.0023%; that is the share of the Federal discretionary budget that went to NPR in 2015. NPR estimates they reach 45 million unique and regular listeners [http://nationalpublicmedia.com/npr/audience/, of course this is hard to verify…], which means that of ~320 million Americans [https://factfinder.census.gov/faces/tableservices/jsf/pages/productview.xhtml?src=bkmk] about 1 out of every 7 listens to NPR each month. This comes at a cost to the Federal government of \$0.62 per listener per year, or a burden on the (average) American taxpayer of ~\$0.12 per year (243 million pay taxes [https://www.reference.com/government-politics/many-u-s-taxpayers-d77a9265390f4bdb]).
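The arithmetic above is simple enough to check in a few lines (a Python back-of-envelope, reproducing the inputs exactly as cited in the text):

```python
# Back-of-envelope check of the numbers quoted above (inputs as cited in the text).
npr_federal = 28e6      # upper-bound Federal dollars to NPR, FY2015
discretionary = 1.2e12  # FY2015 Federal discretionary spending
listeners = 45e6        # NPR's estimated monthly audience
taxpayers = 243e6       # approximate number of US taxpayers

pct_of_discretionary = npr_federal / discretionary * 100  # share of discretionary, percent
per_listener = npr_federal / listeners                    # dollars per listener per year
per_taxpayer = npr_federal / taxpayers                    # dollars per taxpayer per year
```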

The Federal discretionary budget sent \$582 billion to Defense spending in 2015. That’s just under half of the entire Federal discretionary budget. The number is even larger when you consider defense “related” expenses [https://en.wikipedia.org/wiki/Military_budget_of_the_United_States]. Averaging the same way as above, this means a burden of ~\$2,395 on the American taxpayer. I obviously recognize that some pay more and many pay less, but I’m just trying to make an argument about the disparity in the order of magnitude of these numbers. I won’t go on here, because it’s hard to make an argument about defense spending efficiency with facts, because 1) the facts tend to be obfuscated by the reporting agencies, and 2) they are extremely complex and I’m far from an expert.

The Federal agencies that support arts and sciences (e.g., NPR, NEA, NSF, NASA, NIH) make Americans smarter and cost mere pennies on the dollar compared to other portions of the government operating budget. You can repeat the exercise I did above for NPR with any of these organizations, and the story is the same. NPR educates and entertains Americans on a wide range of subjects. The NEA provides culture and dignity to cities around the country. The NSF supports cutting-edge research in engineering, physics, math, psychology, Earth science, and more at institutions across the country, research that ultimately leads to technology improving the lives of every American. NASA does the same and arguably leads the world in space exploration and planetary science. And the NIH funds research on all living things that makes us healthier and happier people every day. I implore you: don’t look to non-profits supporting the arts and sciences in an effort to curb federal spending (which I support, by the way), when these organizations are generally efficient by pure necessity and provide immense benefits to the millions of individuals that comprise our great Nation.