370CT: Concurrency Part Two

Dr Carey Pridgeon

Created: 2017-06-21 Wed 14:08

Concurrency Part Two

Semaphores

  • A semaphore has an internal counter rather than a lock flag.
  • Threads can acquire the semaphore until its internal counter, which starts at the specified value and is decremented on each acquire, reaches zero; further acquires then block until a release.

    import threading

    max_connections = 10
    semaphore = threading.BoundedSemaphore(max_connections)

    semaphore.acquire()   # blocks once max_connections threads hold it
    # ... do the work that needs limiting ...
    semaphore.release()   # free a slot so another thread can proceed
    
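  • A minimal sketch (with hypothetical worker names) of how a semaphore limits how many threads run a section at once:

    import threading
    import time

    max_connections = 3
    semaphore = threading.BoundedSemaphore(max_connections)

    def worker(n):
        with semaphore:                 # acquire; released automatically on exit
            print("worker", n, "connected")
            time.sleep(0.1)             # pretend to use the connection

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(10)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()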

Re-Entrant Lock - 1

  • A Re-entrant lock will not block if the thread that already owns it tries to obtain it again, but will block if another thread tries to obtain it.

    lock = threading.Lock()
    lock.acquire()
    lock.acquire() # this will block even if called by the owning thread

    lock = threading.RLock()
    lock.acquire()
    lock.acquire() # this won't block if called by the owning thread
    

Re-Entrant Lock - 2

  • The main use for this is nested access to shared resources.
  • It is recursive, so allows nesting or recursion in threads without messing up your code.
  • Since it counts recursive 'acquires', you must 'release' it the same number of times before other threads can obtain it (see the sketch below).
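  • A minimal sketch (hypothetical function names) of nested acquires that would deadlock with a plain Lock but are fine with an RLock:

    import threading

    counter = 0
    lock = threading.RLock()

    def increment():
        global counter
        with lock:              # acquired a second time by the same thread
            counter += 1

    def update_twice():
        with lock:              # first acquire
            increment()         # nested acquire: fine with RLock, deadlock with Lock
            increment()

    update_twice()
    print(counter)              # prints 2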

Events

  • An event is a simple synchronization object.
  • The event represents an internal flag.

    event = threading.Event()
    # a thread can wait for the flag to be set
    event.wait()
    # another thread can set or reset it
    event.set()
    event.clear()
    
  • You can use this to have one thread tell another when it may proceed, as in the sketch below.
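  • A minimal sketch (hypothetical names) of one thread waiting on an Event that another thread sets:

    import threading
    import time

    start_signal = threading.Event()

    def worker():
        print("worker: waiting for the signal")
        start_signal.wait()          # blocks until the flag is set
        print("worker: signal received, running")

    t = threading.Thread(target=worker)
    t.start()

    time.sleep(1)                    # pretend to do some setup
    start_signal.set()               # wake the waiting thread
    t.join()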

Condition Variables

  • More complex than Events, though Python's API makes them look quite similar.
  • A Thread can be made to block until a Condition is satisfied.

    from threading import Lock, Condition

    lock = Lock()
    cv = Condition(lock)

    # waiting thread
    cv.acquire()        # take the underlying lock
    cv.wait()           # release the lock and block until notified
    # do the work
    cv.release()

    # notifying thread
    cv.acquire()
    cv.notify_all()     # wake all threads waiting on this condition
    # or cv.notify(n)   # wake up to n waiting threads
    cv.release()        # release the underlying lock
    
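  • A minimal sketch (hypothetical names) of a waiting thread and a notifying thread sharing a Condition; the while loop guards against spurious wake-ups:

    import threading

    items = []
    cv = threading.Condition()

    def consumer():
        with cv:
            while not items:          # re-check the condition after each wake-up
                cv.wait()
            print("consumed", items.pop(0))

    def producer():
        with cv:
            items.append("data")
            cv.notify()               # wake one waiting consumer

    t = threading.Thread(target=consumer)
    t.start()
    producer()
    t.join()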

The Producer Consumer Problem

  • A multi-process problem widely used to teach concurrency.
  • It captures how much of the internet, and computing generally, works: one component produces data and another consumes it.

    prodcon.png

The Buffer - 1

  • Buffers are common in concurrency: a buffer compensates for the inevitable delays and timing issues involved when two processes exchange information.
  • Buffers are, in their simplest implementation, FIFO structures: data goes in one end and comes out the other.

    buffer.png
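
  • As a rough illustration, Python's thread-safe queue.Queue is itself a FIFO buffer:

    import queue

    buffer = queue.Queue(maxsize=5)   # bounded buffer with room for 5 elements

    for i in range(3):
        buffer.put(i)                 # blocks if the buffer is full

    while not buffer.empty():
        print(buffer.get())           # 0, 1, 2 -- first in, first out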

The Buffer - 2

  • Compensating for out-of-order transmission of elements is handled above the buffer, and we won't have time to cover that in detail.
  • In the toy producer-consumer problem, a single buffer is shared and delivery is entirely ordered. In non-trivial implementations, often neither of these is true.
  • What are some examples of this?

The Producer - 1

  • A producer is anything that creates, or provides data.
    • A socket
    • A sensor
    • Any data source
  • Producers provide data and place it in a buffer to be read by any consumers.
  • In the simple problem, there is just the buffer and consumer signalling to be concerned with; a producer sketch follows.
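  • A minimal producer sketch (hypothetical names), using a queue.Queue as the shared buffer:

    import queue
    import threading

    shared_buffer = queue.Queue()

    def producer():
        for i in range(5):
            item = "item %d" % i       # any data source: socket, sensor, ...
            shared_buffer.put(item)    # place it in the buffer for consumers
            print("produced", item)

    t = threading.Thread(target=producer)
    t.start()
    t.join()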

The Producer - 2

producer.png

The Consumer - 1

consumer.png


The Consumer - 2

  • Consumers remove elements from the buffer.
  • Buffers are usually cyclic, or ring structures.
  • If a simple array is used, the elements must be shifted forwards each time elements are removed, so room for fresh elements is created at the start of the array.
  • Circular buffers have pointers to both read and write locations that cycle round the buffer.
  • The read pointer can never overtake the write pointer; a simple consumer sketch follows.
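  • A minimal consumer sketch (hypothetical names), removing elements from a shared queue.Queue buffer:

    import queue
    import threading

    shared_buffer = queue.Queue()
    for i in range(5):
        shared_buffer.put(i)           # pre-fill the buffer for this example

    def consumer():
        while not shared_buffer.empty():
            item = shared_buffer.get() # remove the oldest element
            print("consumed", item)

    t = threading.Thread(target=consumer)
    t.start()
    t.join()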

Circular Buffer - 1

Circular_Buffer_Animation.gif

Circular Buffer - 2

  • A circular buffer is only circular in concept, not in reality.
  • It would be a one dimensional array, where if either the read or write pointer reaches the end of the array it moves to the start.
  • If the read pointer catches up to the write pointer it needs to wait.
  • You most often see this as 'buffering' on YouTube or other video sites (literally, waiting for the buffer to fill with more data); a sketch of the wrap-around logic follows.
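  • A minimal sketch (hypothetical class) of the wrap-around read/write logic:

    class CircularBuffer:
        def __init__(self, size):
            self.data = [None] * size
            self.size = size
            self.read = 0      # index of the next element to read
            self.write = 0     # index of the next free slot
            self.count = 0     # number of unread elements

        def put(self, item):
            if self.count == self.size:
                raise BufferError("buffer full - writer must wait")
            self.data[self.write] = item
            self.write = (self.write + 1) % self.size   # wrap to the start
            self.count += 1

        def get(self):
            if self.count == 0:
                raise BufferError("buffer empty - reader must wait")
            item = self.data[self.read]
            self.read = (self.read + 1) % self.size     # wrap to the start
            self.count -= 1
            return item

    buf = CircularBuffer(4)
    for i in range(4):
        buf.put(i)
    print(buf.get(), buf.get())   # 0 1 -- read can never overtake write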

Circular Buffer - 3

cyclic_buffer.png


The Lock Queue Problem

  • As more threads access the same data, the queues they must wait in to access the critical sections get longer.
  • More time can be spent waiting in the queue than actually working.
  • In small threaded programs this is not an issue, but over a certain level of complexity you need to move to parallel, not concurrent, programming.
  • We will start covering this next week.

Credits

circular buffer gif: MuhannadAjjan, 1/12/2015, en.wikipedia.org