First commit
PApostol committed Oct 16, 2021
0 parents commit 02fac53
Showing 9 changed files with 521 additions and 0 deletions.
5 changes: 5 additions & 0 deletions CHANGELOG.md
## Spark-submit
##### Latest version: 0.1.0

#### 0.1.0 (2021-10-16)
- First release
21 changes: 21 additions & 0 deletions LICENSE
MIT License

Copyright (c) 2021-2022 PApostol

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
153 changes: 153 additions & 0 deletions README.md
## Spark-submit

[![PyPI version](https://badge.fury.io/py/spark-submit.svg)](https://badge.fury.io/py/spark-submit)
[![Github All Releases](https://img.shields.io/github/downloads/PApostol/spark-submit/total.svg)]()
[![](https://img.shields.io/badge/python-3.5+-blue.svg)](https://www.python.org/downloads/)
[![License](https://img.shields.io/badge/License-MIT-blue)](#license "Go to license section")
[![contributions welcome](https://img.shields.io/badge/contributions-welcome-brightgreen.svg?style=flat)](https://github.com/PApostol/spark-submit/issues)

#### TL;DR: Python manager for spark-submit jobs

### Description
This package allows for submission and management of Spark jobs in Python scripts via [Apache Spark's](https://spark.apache.org/) `spark-submit` functionality.

### Installation
The easiest way to install is using `pip`:

`pip install spark-submit`

To install from source:
```
git clone https://github.com/PApostol/spark-submit.git
cd spark-submit
python setup.py install
```

For usage details check `help(spark_submit)`.

### Usage Examples
Spark arguments can either be provided as keyword arguments or as an unpacked dictionary.

##### Simple example:
```
from spark_submit import SparkJob
app = SparkJob('/path/some_file.py', master='local', name='simple-test')
app.submit()
print(app.get_state())
```
##### Another example:
```
from spark_submit import SparkJob
spark_args = {
    'master': 'spark://some.spark.master:6066',
    'deploy_mode': 'cluster',
    'name': 'spark-submit-app',
    'class': 'main.Class',
    'executor_memory': '2G',
    'executor_cores': '1',
    'total_executor_cores': '2',
    'verbose': True,
    'conf': ["spark.foo.bar='baz'", "spark.x.y='z'"],
    'main_file_args': '--foo arg1 --bar arg2'
}
app = SparkJob('s3a://bucket/path/some_file.jar', **spark_args)
print(app.submit_cmd)
print(app.env_vars)
# monitor state in the background every x seconds with `await_result=x`
app.submit(use_env_vars=True, await_result=10)
print(app.get_state()) # 'SUBMITTED'
# do other stuff...
print(app.get_state()) # 'FINISHED'
```
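The `await_result=10` call above monitors the job state in the background every 10 seconds until the job reaches a terminal state. The general shape of such a polling loop can be sketched as follows, with a stand-in for `app.get_state()` (the helper name and the set of terminal states are illustrative, not part of the package):

```python
import time

TERMINAL_STATES = ('FINISHED', 'FAILED', 'KILLED')

def poll_until_done(get_state, interval=0.01):
    """Poll a state getter every `interval` seconds until a terminal state appears."""
    state = get_state()
    while state not in TERMINAL_STATES:
        time.sleep(interval)
        state = get_state()
    return state

# Stand-in for app.get_state(): yields a fixed sequence of states
states = iter(['SUBMITTED', 'RUNNING', 'FINISHED'])
final = poll_until_done(lambda: next(states))
print(final)  # FINISHED
```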

#### Examples of `spark-submit` to `spark_args` dictionary:
##### A `client` example:
```
~/spark_home/bin/spark-submit \
--master spark://some.spark.master:7077 \
--name spark-submit-job \
--total-executor-cores 8 \
--executor-cores 4 \
--executor-memory 4G \
--driver-memory 2G \
--py-files /some/utils.zip \
--files /some/file.json \
/path/to/pyspark/file.py --data /path/to/data.csv
```
##### becomes
```
spark_args = {
    'master': 'spark://some.spark.master:7077',
    'name': 'spark-submit-job',
    'total_executor_cores': '8',
    'executor_cores': '4',
    'executor_memory': '4G',
    'driver_memory': '2G',
    'py_files': '/some/utils.zip',
    'files': '/some/file.json',
    'main_file_args': '--data /path/to/data.csv'
}
main_file = '/path/to/pyspark/file.py'
app = SparkJob(main_file, **spark_args)
```
##### A `cluster` example:
```
~/spark_home/bin/spark-submit \
--master spark://some.spark.master:6066 \
--deploy-mode cluster \
--name spark_job_cluster \
--jars "s3a://mybucket/some/file.jar" \
--conf "spark.some.conf=foo" \
--conf "spark.some.other.conf=bar" \
--total-executor-cores 16 \
--executor-cores 4 \
--executor-memory 4G \
--driver-memory 2G \
--class my.main.Class \
--verbose \
s3a://mybucket/file.jar "positional_arg1" "positional_arg2"
```
##### becomes
```
spark_args = {
    'master': 'spark://some.spark.master:6066',
    'deploy_mode': 'cluster',
    'name': 'spark_job_cluster',
    'jars': 's3a://mybucket/some/file.jar',
    'conf': ["spark.some.conf='foo'", "spark.some.other.conf='bar'"], # note the use of quotes
    'total_executor_cores': '16',
    'executor_cores': '4',
    'executor_memory': '4G',
    'driver_memory': '2G',
    'class': 'my.main.Class',
    'verbose': True,
    'main_file_args': '"positional_arg1" "positional_arg2"'
}
main_file = 's3a://mybucket/file.jar'
app = SparkJob(main_file, **spark_args)
```
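As both conversions above illustrate, each `spark-submit` flag appears to map to its `spark_args` key by dropping the leading `--` and turning hyphens into underscores. A one-line helper capturing that convention (the function name is ours, not part of the package):

```python
def flag_to_key(flag: str) -> str:
    """Convert a spark-submit CLI flag (e.g. '--executor-memory')
    to the corresponding spark_args dictionary key ('executor_memory')."""
    return flag.lstrip('-').replace('-', '_')

print(flag_to_key('--executor-memory'))       # executor_memory
print(flag_to_key('--total-executor-cores'))  # total_executor_cores
```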
#### Additional methods

`spark_submit.system_info()`: Collects Spark-related system information, such as the versions of spark-submit, Scala, Java, Python, and the OS

`spark_submit.SparkJob.kill()`: Kills the running Spark job (cluster mode only)

`spark_submit.SparkJob.get_code()`: Gets the spark-submit return code

`spark_submit.SparkJob.get_output()`: Gets the spark-submit stdout

### License

Released under [MIT](/LICENSE) by [@PApostol](https://github.com/PApostol)

- You can freely modify and reuse
- The original license must be included with copies of this software
- Please link back to this repo if you use a significant portion of the source code
40 changes: 40 additions & 0 deletions setup.py
"""spark-submit module installation script."""
from setuptools import setup, find_packages
import os

info_location = os.path.join(os.path.abspath(os.path.dirname(__file__)), 'spark_submit', '__info__.py')
about = {}
with open(info_location, 'r') as f:
    exec(f.read(), about)

with open('README.md', 'r') as f:
    readme = f.read()

setup(
    name = about['__title__'],
    version = about['__version__'],
    author = about['__author__'],
    maintainer = about['__maintainer__'],
    author_email = about['__author_email__'],
    license = about['__license__'],
    url = about['__url__'],
    description = about['__description__'],
    long_description_content_type = 'text/markdown',
    long_description = readme,
    packages = find_packages(),
    include_package_data = True,
    install_requires = ['requests'],
    classifiers = [
        'Development Status :: 4 - Beta',
        'Intended Audience :: Developers',
        'License :: OSI Approved :: MIT License',
        'Topic :: Software Development :: Libraries',
        'Topic :: Utilities',
        'Programming Language :: Python :: 3',
        'Operating System :: OS Independent',
    ],
    zip_safe = True,
    platforms = ['any'],
    python_requires = '~=3.5',
    keywords = ['apache', 'spark', 'submit'],
)
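Note how setup.py reads the package metadata by `exec`-ing the contents of `__info__.py` into a plain dictionary rather than importing `spark_submit`, which avoids import-time failures before the package's dependencies are installed. The pattern in isolation (the metadata string here is illustrative):

```python
# Execute metadata source into a dict, as setup.py does with __info__.py
metadata_src = "__title__ = 'spark-submit'\n__version__ = '0.1.0'\n"
about = {}
exec(metadata_src, about)
print(about['__title__'], about['__version__'])  # spark-submit 0.1.0
```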
17 changes: 17 additions & 0 deletions spark_submit/__info__.py
""" __ __ _ __
_________ ____ ______/ /__ _______ __/ /_ ____ ___ (_) /_
/ ___/ __ \/ __ `/ ___/ //_/_____/ ___/ / / / __ \/ __ `__ \/ / __/
(__ ) /_/ / /_/ / / / ,< /_____(__ ) /_/ / /_/ / / / / / / / /_
/____/ .___/\__,_/_/ /_/|_| /____/\__,_/_.___/_/ /_/ /_/_/\__/
/_/
"""

__title__ = 'spark-submit'
__author__ = 'PApostol'
__author_email__ = '[email protected]'
__maintainer__ = 'PApostol'
__license__ = 'MIT'
__version__ = '0.1.0'
__description__ = 'Python manager for spark-submit jobs'
__url__ = 'https://github.com/PApostol/spark-submit'
__bugtrack_url__ = f'{__url__}/issues'
17 changes: 17 additions & 0 deletions spark_submit/__init__.py
"""spark-submit module"""

from .sparkjob import SparkJob
from .system import system_info
from .__info__ import (
    __title__,
    __author__,
    __author_email__,
    __maintainer__,
    __license__,
    __version__,
    __description__,
    __url__,
    __bugtrack_url__,
)

__all__ = ['SparkJob', 'system_info']
8 changes: 8 additions & 0 deletions spark_submit/exceptions.py
"""Exceptions raised by SparkJob class"""

class SparkSubmitError(Exception):
    pass


class SparkJobKillError(Exception):
    pass
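These exception classes presumably surface failures from `SparkJob.submit()` and `SparkJob.kill()` respectively. A self-contained usage sketch (the class is re-declared locally so the snippet runs standalone; real code would import it from `spark_submit.exceptions`, and `run_job` is a hypothetical stand-in for `SparkJob.submit()`):

```python
class SparkSubmitError(Exception):
    """Raised when a spark-submit invocation fails (mirrors spark_submit.exceptions)."""

def run_job(should_fail: bool) -> str:
    # Stand-in for SparkJob.submit(): raise on a failed submission
    if should_fail:
        raise SparkSubmitError('spark-submit returned a non-zero exit code')
    return 'FINISHED'

try:
    state = run_job(should_fail=False)
except SparkSubmitError as err:
    state = f'failed: {err}'
print(state)  # FINISHED
```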