There's an awesome, free initiative called the Million Song Dataset. It's a great candidate dataset for a MapReduce project, and a great starting point if you want to write your own song-recognition algorithm.
Python only compiles dynamic code at module (or statement) level. That's why the compile() builtin accepts source text and exec() accepts dictionaries of globals and locals, but neither provides a direct way to call a fragment of dynamic (read: textual) source code with arguments ("xyz(arg1, arg2)"). Nor can you directly invoke compiled code as a generator (i.e. have a fragment that uses "yield" be treated as a generator rather than just a function).
However, there's a loophole, and it's very elegant and consistent with the rest of Python. You simply wrap your code in a function definition, execute that, and pull the function out of the local scope. You can then call it as desired:
import hashlib
import random

def _compile(arg_names, code):
    name = "(lambda compile)"
    # The generated name must be a valid identifier (and start with a letter),
    # so use a hex digest rather than the raw random float.
    id_ = 'a' + hashlib.md5(str(random.random()).encode()).hexdigest()
    # Wrap the fragment in a function definition, indenting every body line.
    code = ("def " + id_ + "(" + ', '.join(arg_names) + "):\n" +
            '\n'.join(('    ' + line) for line in code.replace('\r', '').split('\n')) + '\n')
    c = compile(code, name, 'exec')
    # Execute the definition and pull the freshly created function out of the local scope.
    locals_ = {}
    exec(c, globals(), locals_)
    return locals_[id_]
code = """
return a * b * c
"""
c = _compile(['a', 'b', 'c'], code)
print(c(1, 2, 3))  # prints 6
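Because the fragment ends up inside a real def, the same helper also covers the generator case mentioned above: a body that uses yield compiles to a generator function you can iterate. A quick sketch reusing the _compile helper; the fragment and argument names (n, step) are just illustrative:

gen_code = """
for i in range(n):
    yield i * step
"""

make_gen = _compile(['n', 'step'], gen_code)
print(list(make_gen(4, 10)))  # [0, 10, 20, 30]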
To serialize a generator as a JSON array, simply inherit from the list class and override the __iter__ method. A great example, from SO:
import json

def gen():
    yield 20
    yield 30
    yield 40

class StreamArray(list):
    def __iter__(self):
        return gen()

    # Some encoding paths (e.g. indent=... or iterencode) check len() before
    # iterating; report a non-zero length so an "empty" StreamArray isn't
    # short-circuited to [].
    def __len__(self):
        return 1

a = [1, 2, 3]
b = StreamArray()
print(json.dumps([1, a, b]))  # [1, [1, 2, 3], [20, 30, 40]]
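The subclass really pays off when streaming: JSONEncoder.iterencode (which json.dump also uses under the hood) walks the list lazily, so a generator-backed StreamArray can be emitted chunk by chunk instead of building the whole JSON string in memory. A small usage sketch with the StreamArray class above:

encoder = json.JSONEncoder()
for chunk in encoder.iterencode(StreamArray()):
    # Each chunk is a small string fragment of the final JSON document;
    # in real code you'd write it to a file or socket instead of printing.
    print(chunk, end='')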