I think the common wisdom around UAC is that you can't really optimize for cost: Google's machinery is designed to deliver maximum efficiency, and since there's no transparency as to how that process actually works (it's a "black box" algorithm), advertisers are better off setting bids at some target (say, 110% ROI) and simply scaling their budget up and down to optimize delivery.
This makes sense at a conceptual level for a number of reasons. First, these kinds of algorithms take time to "tune" (converge on optimal parameters), and bid changes reset that process, so it's best to leave bids at a static value and let the algorithm do what it can on delivery.
Additionally, assuming that Google's machinery is good at delivering efficiency (which I don't think anyone disputes -- if Google can't run digital marketing campaigns efficiently, then who can?), it makes sense that Google would find the most efficient targeting parameters and then deliver whatever it could against a given budget. If Google could deliver more efficiency for an advertiser, why would it not? It's in Google's best interest to make its advertisers' campaigns successful so that their revenues, and thus their budgets, grow. If that's the case, then the bid isn't really a lever to pull in the optimization process: Google is already doing what it can to deliver the best-performing traffic.
I think the best way to optimize these campaigns is to run an iterative creative testing program that starts with broad themes and gets more granular once obvious winning themes are found. So you might start a campaign with just three or four creative templates, each themed dramatically differently: when one outperforms the others, you create variants of it and continue that process until further iterations stop producing improvements.
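To make that loop concrete, here's a minimal Python sketch of the process. Everything in it is invented for illustration -- the creative names, the scores, the lift threshold -- and in practice each "test" would be days of live spend, with the metric being something like ROAS or installs-per-mille reported by the platform, not a lookup table.

```python
# Minimal sketch of an iterative creative testing loop: start with broad
# themes, then branch into variants of the winner until a round fails to
# beat the incumbent by a minimum lift. All data below is fabricated.

def run_test(creatives, metric):
    # Stand-in for launching a set of creatives and reading back results.
    return {c: metric(c) for c in creatives}

def iterate_creatives(seed_themes, metric, make_variants,
                      min_lift=0.05, max_rounds=5):
    champion = max(run_test(seed_themes, metric).items(),
                   key=lambda kv: kv[1])
    for _ in range(max_rounds):
        results = run_test(make_variants(champion[0]), metric)
        best = max(results.items(), key=lambda kv: kv[1])
        if best[1] < champion[1] * (1 + min_lift):
            break  # diminishing returns: stop iterating
        champion = best
    return champion

# Fabricated performance scores keyed by creative name.
scores = {"gameplay": 0.80, "lifestyle": 0.50, "testimonial": 0.60,
          "gameplay-v1": 0.85, "gameplay-v2": 0.90,
          "gameplay-v2-v1": 0.91, "gameplay-v2-v2": 0.89}

winner = iterate_creatives(
    ["gameplay", "lifestyle", "testimonial"],
    metric=lambda c: scores.get(c, 0.0),
    make_variants=lambda name: [f"{name}-v1", f"{name}-v2"],
)
```

With these made-up numbers, the loop promotes the "gameplay" theme, adopts its stronger variant, and then stops when the next round's best variant doesn't clear the lift threshold -- the "further iterations don't produce improvements" stopping condition described above.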
The other approach is to experiment with the optimization events that are sent back from the app to UAC (which Google uses to evaluate traffic and set targeting parameters). Different events might produce dramatically different target sets -- e.g., tutorial complete vs. first purchase -- and there's often some tension between the volume of an event and the strength of the signal it carries. Finding the right balance between volume and strength takes time and trial and error.
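A toy way to think about that tension: score each candidate event by volume times signal strength. The numbers below are entirely made up, and "predictiveness" here is just a placeholder for how well an event correlates with long-term value -- in reality you'd only learn this by running each event as the optimization target and comparing cohorts, which is exactly the trial-and-error the text describes.

```python
# Toy illustration of the volume-vs-signal-strength tradeoff between
# candidate optimization events. All numbers are fabricated.
events = {
    "install":           {"per_day": 1000, "predictiveness": 0.10},
    "tutorial_complete": {"per_day": 400,  "predictiveness": 0.30},
    "first_purchase":    {"per_day": 25,   "predictiveness": 0.90},
}

def effective_signal(e):
    # Crude heuristic: the algorithm needs event volume to learn, but a
    # weak event teaches it to find the wrong audience.
    return e["per_day"] * e["predictiveness"]

ranked = sorted(events, key=lambda k: effective_signal(events[k]),
                reverse=True)
```

Under these assumed numbers, the mid-funnel event wins: first purchase is the strongest signal but far too rare, and raw installs are plentiful but nearly meaningless. The real curve for any given app will look different, which is why this has to be tested rather than reasoned out.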
The big practical change that algorithmic campaign optimization has introduced to mobile UA is that "campaign management" isn't really a button-pushing, lever-pulling process anymore: buying traffic efficiently actually means running lots of experiments and testing creative concepts systematically.