Enable Spark Context on Your IPython Notebook

time to read 1 min | 76 words

When you’re trying Spark from its Python REPL, it’s really easy to write stuff with a simple function or lambda. However, it becomes a pain in the ass once you start trying more complex stuff, because you can easily mess up something like indentation.

Try running pyspark with this command:

IPYTHON_OPTS="notebook" path/to/your/pyspark

It will start an IPython Notebook in your browser with a SparkContext already available as the sc variable. You can start using it right away, like this:
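Here is a minimal sketch, using nothing but some sample numbers and the sc that pyspark already created for you:

rdd = sc.parallelize(range(10))            # distribute some sample numbers
evens = rdd.filter(lambda x: x % 2 == 0)   # keep only the even ones
evens.collect()                            # [0, 2, 4, 6, 8]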

