BufferQueue improvements and API changes
This is the first step in a series of improvements to
BufferQueue. A few things happen in this change:
- setSynchronousMode() goes away, as does the SynchronousModeAllowed flag
- BufferQueue now defaults to (what used to be) synchronous mode
- a new "controlled by app" flag is passed when creating consumers and producers
These flags are used to put the BufferQueue into a mode where it
will never block if both flags are set. This is achieved by:
- returning an error from dequeueBuffer() if it would block
- making sure a buffer is always available by replacing
the previous buffer with the new one in queueBuffer()
  (note: this is similar to what asynchronous mode used to be; a producer-side sketch of the new contract follows below)
Note: this change breaks EGL's swap-interval 0; this will be
fixed in a follow-up change.
Change-Id: I691f9507d6e2e158287e3039f2a79a4d4434211d
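
For illustration, here is a minimal sketch of the producer-side contract
described above. FrameProducer, its dequeueBuffer() signature, and the use
of WOULD_BLOCK (from utils/Errors.h) are assumptions made for the sketch;
they are not taken from this change.

    // Sketch of the retry logic a producer might use once both
    // "controlled by app" flags are set. FrameProducer is a hypothetical
    // stand-in for the app's producer handle; the real dequeueBuffer()
    // signature is not shown here, and WOULD_BLOCK is assumed to be the
    // error returned when dequeueBuffer() would otherwise block.
    #include <utils/Errors.h>

    using namespace android;

    struct FrameProducer {                        // hypothetical stand-in
        status_t dequeueBuffer(int* outSlot) {    // stub for the sketch
            *outSlot = 0;
            return NO_ERROR;
        }
    };

    bool tryRenderFrame(FrameProducer& producer) {
        int slot = -1;
        status_t err = producer.dequeueBuffer(&slot);
        if (err == WOULD_BLOCK) {
            // dequeueBuffer() no longer blocks; the app retries on its
            // own schedule (e.g. on the next vsync).
            return false;
        }
        if (err != NO_ERROR) {
            return false;  // genuine failure
        }
        // ... fill the buffer, then queueBuffer() it; if a previously
        // queued buffer has not been consumed yet it is replaced rather
        // than blocking the producer ...
        return true;
    }
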
diff --git a/include/gui/CpuConsumer.h b/include/gui/CpuConsumer.h
index 5f1e369..2890350 100644
--- a/include/gui/CpuConsumer.h
+++ b/include/gui/CpuConsumer.h
@@ -67,7 +67,7 @@
// Create a new CPU consumer. The maxLockedBuffers parameter specifies
// how many buffers can be locked for user access at the same time.
CpuConsumer(const sp<BufferQueue>& bq,
- uint32_t maxLockedBuffers, bool synchronousMode = true);
+ uint32_t maxLockedBuffers, bool controlledByApp = false);
virtual ~CpuConsumer();
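
For context, a minimal consumer-side sketch of how the new parameter might
be used. Only the CpuConsumer constructor appears in this diff; the default
BufferQueue construction and the lockNextBuffer()/unlockBuffer() calls are
assumptions based on the surrounding API and may differ.

    // Sketch: creating a CpuConsumer that opts into the new
    // "controlled by app" behavior. The constructor matches the diff above;
    // the BufferQueue constructor arguments and the lock/unlock calls are
    // assumptions about the surrounding API.
    #include <gui/BufferQueue.h>
    #include <gui/CpuConsumer.h>
    #include <utils/Errors.h>

    using namespace android;

    void consumeOneFrame() {
        sp<BufferQueue> bq = new BufferQueue();   // assumed default ctor
        // controlledByApp = true: together with a producer created with
        // the same flag, the BufferQueue will never block.
        sp<CpuConsumer> consumer =
                new CpuConsumer(bq, 2 /* maxLockedBuffers */,
                                true /* controlledByApp */);

        CpuConsumer::LockedBuffer lb;
        if (consumer->lockNextBuffer(&lb) == NO_ERROR) {
            // ... read pixels via lb.data ...
            consumer->unlockBuffer(lb);
        }
    }

Note that the new parameter defaults to false, so existing callers keep the
blocking behavior unless they explicitly opt in.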