''' threading_run_background_deco2.py
apply threading to a function with a function decorator
to allow it to run in the background
tested with Python27 and Python33  by  vegaseat  27jun2014
'''

import time
import threading


def background(f):
    '''
    a threading decorator
    use @background above the function you want to run in the background
    '''
    def bg_f(*a, **kw):
        threading.Thread(target=f, args=a, kwargs=kw).start()
    return bg_f


@background
def counter(name, n):
    """show the count every second"""
    for k in range(1, n+1):
        print("{} counts {}".format(name, k))
        time.sleep(1)

# start the counters
# note that with the @background decorator
# Frank and Doris count simultaneously
# note from SOS: Frank may not always count first
counter("Frank", 5)
counter("Doris", 5)

'''
result with the @background decorator ...
Frank counts 1
Doris counts 1
Frank counts 2
Doris counts 2
Frank counts 3
Doris counts 3
Frank counts 4
Doris counts 4
Frank counts 5
Doris counts 5

result without the @background decorator ...
Frank counts 1
Frank counts 2
Frank counts 3
Frank counts 4
Frank counts 5
Doris counts 1
Doris counts 2
Doris counts 3
Doris counts 4
Doris counts 5
'''

Just to clarify for readers: the output posted using the @background decorator is not guaranteed; it depends on how your OS schedules the Python threads. So another possible output is:

Frank counts 1
Doris counts 1
Frank counts 2
Doris counts 2
Doris counts 3
Frank counts 3
Doris counts 4
Frank counts 4
Doris counts 5
Frank counts 5

(in case you haven't noticed, Doris and Frank changed places after the 2nd count)

Thanks SOS!
After running the code about a dozen times, I also found that Frank does not always count first in the sequence. The counting, however, is still simultaneous.
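If you need the main program to wait for the counters to finish, a small variation on the decorator can return the Thread object so the caller can join() it. This is just a sketch of one possible approach (shorter sleeps than the original, to keep the demo quick):

```python
import threading
import time

def background(f):
    '''threading decorator variant: starts f in a new thread and
    returns the Thread object so the caller can join() it'''
    def bg_f(*a, **kw):
        t = threading.Thread(target=f, args=a, kwargs=kw)
        t.start()
        return t
    return bg_f

@background
def counter(name, n):
    """show the count (short sleep for the demo)"""
    for k in range(1, n + 1):
        print("{} counts {}".format(name, k))
        time.sleep(0.1)

t1 = counter("Frank", 3)
t2 = counter("Doris", 3)
# wait for both counters to finish before continuing
t1.join()
t2.join()
print("both counters finished")
```

The interleaving of the output is still up to the scheduler; join() only gives you a deterministic end point.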

You could also do this with a Process.

import functools
import multiprocessing
def background(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        p = multiprocessing.Process(target=func, args=args, kwargs=kwargs)
        p.start()
    return wrapper

Also, I want to point something out for new people who may be reading this. It's good practice to use the functools.wraps decorator when writing decorators like this.

Example:

import functools

def decoratorwithout(f):
    def wrapper():
        return f()
    return wrapper

def decoratorwith(f):
    @functools.wraps(f)
    def wrapper():
        return f()
    return wrapper

@decoratorwithout
def myfuncwithout():
    """ You can't see this with help() or other tools. """
    return None

@decoratorwith
def myfuncwith():
    """ This is correct documentation. """
    return None

Checking the docs:

help(myfuncwith)

Help on function myfuncwith in module __main__:

myfuncwith()
    This is correct documentation.

And the function without functools.wraps:

help(myfuncwithout)

Help on function wrapper in module __main__:

wrapper()

functools.wraps sets the __doc__ and other things to help avoid confusion when wrapping functions.

print(myfuncwithout.__name__)
wrapper

print(myfuncwith.__name__)
myfuncwith


I always use the @functools.wraps(f) decorator for decorator functions to help with debugging.

@vegaseat, from the Python doc page for multiprocessing:

It runs on both Unix and Windows.

It's stdlib stuff so I assumed it was cross-platform, but to be honest I've never tested it on Windows.

I see on the doc page that there are some differences between spawn and fork in the multiprocessing API itself. Windows only supports spawn and uses it by default. Unix uses fork by default; before Python 3.4 it only had fork, with spawn support added in 3.4. I see no differences in the Process API, which by the way is compatible with the Thread API (awesome).
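On Python 3.4+ you can check which start method your platform defaults to; a quick way to see it (this call does not exist before 3.4):

```python
import multiprocessing

if __name__ == '__main__':
    # reports 'fork' on Unix, 'spawn' on Windows (Python 3.4+ only)
    print(multiprocessing.get_start_method())
```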


I tested the multiprocessing approach on Windoze and it failed:

import time
import functools
import multiprocessing


def background(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        p = multiprocessing.Process(target=func, args=args, kwargs=kwargs)
        p.start()
    return wrapper


@background
def counter2(n):
    """show the count every second"""
    for k in range(n):
        print("counting %d" % k)
        time.sleep(1)

# start the counter
counter2(7)
time.sleep(0.9)
print('hello')   # prints hello, before counter prints 1
time.sleep(3)
print('world')   # prints world, before counter prints 4

''' should give result ...
counting 0
hello
counting 1
counting 2
counting 3
world
counting 4
counting 5
counting 6

... instead gives this error (Windows7) ...
Traceback (most recent call last):
  File "C:\Python34\Atest34\aatest34\multiprocessing_run_background_deco1.py", line 43, in <module>
    counter2(7)
  File "C:\Python34\Atest34\aatest34\multiprocessing_run_background_deco1.py", line 18, in wrapper
    p.start()
  File "C:\Python34\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "C:\Python34\lib\multiprocessing\context.py", line 212, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Python34\lib\multiprocessing\context.py", line 313, in _Popen
    return Popen(process_obj)
  File "C:\Python34\lib\multiprocessing\popen_spawn_win32.py", line 66, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Python34\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <function counter2 at 0x02E6A1E0>: it's not the same object as __main__.counter2
'''

Multiprocessing on Windows does work, but it is a bit crippled compared to multiprocessing on *nix. For example, the following code snippet works fine on Windows:

from __future__ import print_function
import time
import functools
import multiprocessing

def runFuncInMp(func, *args, **kwargs):
    p = multiprocessing.Process(target=func, args=args, kwargs=kwargs)
    p.start()

def counter(n):
    """show the count every second"""
    for k in range(n):
        print("counting %d" % k)
        time.sleep(1)

if __name__ == '__main__':
    # start the counter
    runFuncInMp(counter, 7)
    time.sleep(0.9)
    print('hello')   # prints hello, before counter prints 1
    time.sleep(3)
    print('world')   # prints world, before counter prints 4

Notice the removal of the decorator and the addition of the if __name__ == '__main__' check.


Sorry to say, but the output using Windows 8.1 is not the desired one. All you get is:
hello
world

Sorry to say, but the output using Windows 8.1 is not the desired one. All you get is:

Are you running this from the command line or some sort of an IDE? It works fine for me on Windows 8.1

D:\misc>python test.py
counting 0
hello
counting 1
counting 2
counting 3
world
counting 4
counting 5
counting 6

@~s.o.s~
You are right, running it from a DOS batch file works. Running it from any of several IDEs does not work well. It looks like the IDE output window in this case blocks multiprocessing from working properly.

The original threading version does not have this problem.

I don't want to go off on a tangent here, but maybe some nice person can test HiHe's multiprocessor-decorator version on a Unix box.


some nice person can test HiHe's multiprocessor-decorator version on a Unix box.

It works fine on a *nix box (as expected)

sos@ubuntu_14.04:~/personal$ python test.py 
counting 0
hello
counting 1
counting 2
counting 3
world
counting 4
counting 5
counting 6


A little look at concurrent.futures
concurrent.futures has a minimalistic API for threading and multiprocessing.
You only need to change one word to switch between ThreadPoolExecutor (threading) and ProcessPoolExecutor (multiprocessing).
concurrent.futures has been backported to Python 2.7.
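A minimal sketch of that one-word switch (square is just a made-up worker function for the demo):

```python
from __future__ import print_function
import concurrent.futures
import time

def square(n):
    """toy worker; sleeps briefly to simulate real work"""
    time.sleep(0.01)
    return n * n

# swap ThreadPoolExecutor for ProcessPoolExecutor (plus an
# if __name__ == '__main__' guard on Windows) to use processes instead
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
    results = list(executor.map(square, range(5)))

print(results)  # [0, 1, 4, 9, 16]
```

executor.map returns results in input order, even though the workers run concurrently.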

A look at ProcessPoolExecutor(multiprocessing)

from __future__ import print_function
from time_code import timeit
import concurrent.futures
import time

def counter(n):
    """show the count every second"""
    for k in range(n):
        print("counting {}".format(k))
        time.sleep(1)

@timeit
def main():
    with concurrent.futures.ProcessPoolExecutor(max_workers=4) as executor:
        for i in range(20):
            executor.submit(counter, i)

if __name__ == "__main__":
    main()

On Windows you need the if __name__ == "__main__": guard and must run from the command line to see the output from print.
It can work from an IDE if you don't need to see the print output.

Here with max_workers=4 I got a time of 55 sec.
If the load is spread over more processes, I should see a faster time.
Changing to max_workers=30 brings the time down to 19 sec.

The @timeit code I use for this:

# time_code.py
from __future__ import print_function
import time

def timeit(f):
    '''Timing decorator: prints the name, arguments, and elapsed time'''
    def timed(*args):
        ts = time.time()
        result = f(*args)
        te = time.time()
        print('func:{!r}  {!r} took: {:.2f} sec'.format(f.__name__, args, te - ts))
        return result
    return timed


Responding to the PicklingError that HiHe observed:
The multiprocessing package offers both local and remote concurrency, effectively side-stepping the Global Interpreter Lock (GIL) by using subprocesses instead of threads. Because of this, only picklable objects can be passed to and returned from worker processes.
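A small sketch of why the decorated version trips up: a plain module-level function pickles by reference to its name, but once a decorator has rebound that name to a wrapper, the original function captured in the closure can no longer be found under it. (The names here are made up for the demo.)

```python
import pickle

def plain():
    return 42

def shadowing_decorator(f):
    # deliberately no functools.wraps, like the failing example
    def wrapper():
        return f()
    return wrapper

@shadowing_decorator
def decorated():
    return 42

pickle.dumps(plain)  # fine: pickled by reference to its module-level name

try:
    pickle.dumps(decorated)
except (pickle.PicklingError, AttributeError) as e:
    # spawn-based multiprocessing hits the same wall when it tries
    # to send the target function to the child process
    print("pickling failed:", e)
```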

I tested some of the multiprocessing examples in the above article. On a Windows machine they only work if you run them from the command line, in other words the ugly DOS window.

The article starter has earned a lot of community kudos, and such articles offer a bounty for quality replies.