Enable Spark Context in Your IPython Notebook

time to read 1 min | 76 words

When you’re trying Spark through its Python REPL, it’s really easy to write simple functions or lambdas. However, it becomes a pain once you start on more complex code, because it’s easy to slip on things like indentation in a bare REPL.

Try launching pyspark with this command:

IPYTHON_OPTS="notebook" path/to/your/pyspark
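Note that `IPYTHON_OPTS` was removed in Spark 2.0; on newer Spark versions the equivalent (assuming Jupyter is installed) is to set the driver Python instead:

```shell
# Spark 2.0+ replacement for IPYTHON_OPTS="notebook"
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS=notebook
path/to/your/pyspark
```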

It will start an IPython Notebook in your browser with the Spark context available as the `sc` variable. You could start using it like this:

