#12/7/2018
Since I am done implementing my APIs and writing unit tests, I have completed everything on my Gantt chart prior
to this week. This week, I helped create the poster for intersections (around 6-8 hours of work).
#11/30/2018
Implemented x/y axis labeling for Twitter and Instagram plots
Implemented a global maximum label for Twitter and Instagram plots
Truncated dates to the minute for the x-labels, to better show the plots
Fixed a minor file path issue for the Twitter plot blobfile
Since I finished my work ahead of schedule, this week, I am also working on
the poster for intersections. This is not tracked in the git repo.
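The date-truncation step from this week can be sketched as a small helper, assuming the labels come from Python datetimes (the function name is mine, not the project's):

```python
from datetime import datetime

def minute_label(stamp: datetime) -> str:
    """Truncate a timestamp to the minute for a compact x-axis label."""
    return stamp.strftime("%m-%d %H:%M")

# Two samples taken seconds apart share one label,
# which keeps the x-axis readable.
a = minute_label(datetime(2018, 11, 30, 14, 5, 12))
b = minute_label(datetime(2018, 11, 30, 14, 5, 48))
```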
#11/23/2018 Thanksgiving break: gap week
#11/13/2018
Using bs4, I made a parser to parse the IDs out of account webpages.
I made follow/unfollow/block and other related functions.
I encountered a flaw in the InstagramAPI and documented this edge case in my
parser helper methods.
I created graphs for dynamic Instagram information over time
(e.g. followers over time)
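The ID-parsing step might look roughly like this; the markup and attribute names below are hypothetical, since the real account pages aren't recorded here:

```python
from bs4 import BeautifulSoup

# Hypothetical account-page markup standing in for the real pages.
HTML = """
<div class="account" data-user-id="12345">alice</div>
<div class="account" data-user-id="67890">bob</div>
"""

def parse_ids(html: str) -> list:
    """Collect the user ID attached to each account element."""
    soup = BeautifulSoup(html, "html.parser")
    return [tag["data-user-id"] for tag in soup.find_all("div", class_="account")]
```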
#11/9/2018
I moved a bio method to the main twitter module
so that the scheduler can dynamically dispatch this method.
Implemented posting media/photos for an Instagram account
Created functions to handle all types of input for bio updating,
as well as string formatting for certain operations
(e.g. phone numbers)
Added Instagram tokens to the config parser, and implemented
the API interface.
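Loading the tokens through Python's configparser might look like this; the section and key names are placeholders, since the real config layout isn't recorded in the log:

```python
import configparser

# Placeholder section/key names standing in for the project's real config.
SAMPLE = """
[instagram]
username = my_account
password = my_secret
"""

def instagram_credentials(text: str):
    """Pull the Instagram credentials out of the shared config text."""
    config = configparser.ConfigParser()
    config.read_string(text)
    section = config["instagram"]
    return section["username"], section["password"]
```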
#11/2/2018
Implemented a reader for the database scheduler that returns a JSON object
of current tasks to schedule.
Implemented database writes, using datetime's timestamp to provide
easy comparison for the reader operations
Did various work on the setup db file, fixed minor errors
Refactored the database schema to accommodate job IDs and
also multiple platforms per job
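A minimal sketch of the reader/writer pair, using an in-memory SQLite table that stands in for the real schema (table and column names are mine):

```python
import json
import sqlite3
from datetime import datetime

# Stand-in schema; the real master_schedule.sqlite has more columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Job (id INTEGER PRIMARY KEY, text TEXT, stamp REAL)")

def write_job(text: str) -> None:
    """Store the job with datetime's numeric timestamp so readers can
    compare and order rows without parsing date strings."""
    conn.execute("INSERT INTO Job (text, stamp) VALUES (?, ?)",
                 (text, datetime.now().timestamp()))

def current_tasks() -> str:
    """Return a JSON object of current tasks, ordered by timestamp."""
    rows = conn.execute("SELECT id, text, stamp FROM Job ORDER BY stamp").fetchall()
    return json.dumps({"tasks": [{"id": r[0], "text": r[1]} for r in rows]})
```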
#10/26/2018
In AI_test.py:
Tested the polarity of sentences
Tested the sentiment conversion from a sample graphfile
Tested sentiment output for a blobfile
In csv_test.py:
Tested column reading, ignoring the headers
Tested CSV setup for the twitter streaming
Tested CSV contents of the setup file
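The polarity tests might be structured like this; since the real helper isn't shown in the log, a trivial word-list scorer stands in for the TextBlob-backed sentiment code:

```python
POSITIVE = {"great", "good", "happy"}
NEGATIVE = {"bad", "sad", "awful"}

def polarity(sentence: str) -> float:
    """Stand-in scorer: +1/-1 per matched word, averaged over matches."""
    words = sentence.lower().split()
    scores = [1.0 for w in words if w in POSITIVE] + \
             [-1.0 for w in words if w in NEGATIVE]
    return sum(scores) / len(scores) if scores else 0.0

def test_polarity():
    # Positive, negative, and neutral sentences each get the expected sign.
    assert polarity("what a great day") > 0
    assert polarity("an awful result") < 0
    assert polarity("the sky is blue") == 0.0
```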
#10/19/2018
Implemented methods for removing posts, getting
follower information, posting a photo, and posting text.
Implemented a limit on the hashtag streaming, so the streamer
stops after a set number of tweets found.
Implemented getters for all info about a user profile, such as
their Twitter ID, latest tweet, tweet from a given ID, latest
favorites, and retweet information.
Implemented getters and updaters for bio operations, such as
updating one's bio, updating one's name, and getting name and
bio information.
Implemented a graphing file in CSV format that stores datetime
info about streamed tweets
Implemented a blob file containing TextBlob sentiment analysis
on streamed tweets
After filtering realtime tweets for the streamer, I was able to
load the tweet JSON object and extract the text of the tweet. I
implemented a streamlined process for storing datetime info,
using a rudimentary AI to analyze the sentiment positivity, and
graphing the positivity over time.
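The streamlined process can be sketched end to end; the keyword-based scorer below is only a stand-in for the TextBlob step, and the CSV layout is my own guess:

```python
import csv
import io
from datetime import datetime

def polarity(text: str) -> float:
    """Stand-in scorer; the project used a rudimentary AI (TextBlob) here."""
    lowered = text.lower()
    return 1.0 if "love" in lowered else -1.0 if "hate" in lowered else 0.0

def record(tweets, out) -> None:
    """Write one (timestamp, polarity) row per tweet for later graphing."""
    writer = csv.writer(out)
    writer.writerow(["timestamp", "polarity"])
    for text in tweets:
        writer.writerow([datetime.now().isoformat(), polarity(text)])

buffer = io.StringIO()
record(["I love this", "I hate that"], buffer)
```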
#10/12/2018
Filters realtime tweets based on a hashtag:
Continuously streams tweets in real time, no delay is detected
Writes results to an output file called tweets.txt
Since I gave myself until next week to implement Pagination and Posting operations, I will
delay these until then.
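Offline, the filtering step behaves like this sketch over already-fetched tweet objects (the live version works on the realtime stream instead, and the sample data is invented):

```python
def matching_texts(tweets, hashtag: str):
    """Keep the text of each tweet that mentions the hashtag."""
    tag = hashtag.lower()
    return [t["text"] for t in tweets if tag in t["text"].lower()]

sample = [{"text": "Demo day #Postr"}, {"text": "unrelated chatter"}]
matches = matching_texts(sample, "#postr")
# In the real streamer, each matching line is appended to tweets.txt.
```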
Also, I worked on integrating pylint and mypy into precommit:
Our team cannot "git commit" anything until the pylint and mypy
syntax and style checkers detect zero errors across every file.
This ensures that our files are consistent and maintainable.
I fixed all week 1-3 code written prior to mine to conform with the pylint and
mypy precommit checks. Now, our team is up-to-date in this regard.
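The hook's gating logic amounts to something like this; the real hook invokes pylint and mypy, but the checker commands are parameterized here so the sketch runs anywhere:

```python
import subprocess

def run_checks(files, commands=(("pylint",), ("mypy",))):
    """Run each checker over the given files. Return True only if every
    checker exits with status zero; a False result blocks the commit."""
    for cmd in commands:
        if subprocess.run([*cmd, *files]).returncode != 0:
            return False
    return True
```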
Created a setup file for first-time users who download/clone the postr project
Running this file creates a database to hold scheduling operations, called
master_schedule.sqlite.
Job:
Holds data for arbitrary text/media posts, with an optional field for
any additional info that social media sites require
Bio:
Holds data for social media bios, with the option of using a
display name instead of one's first and last name
Person:
Holds data for an arbitrary person, which can be linked from
a bio row through a foreign key
DailyJob:
Holds data for an arbitrary job (connected through a foreign key).
This job can have set intervals and frequencies, and will continue to
operate until its frequency limit for the day is met.
MonthlyJob:
Holds data for an arbitrary job (connected through a foreign key).
This job can have set intervals and frequencies, and will continue to
operate until its frequency limit for the month is met.
CustomJob:
Holds data for an arbitrary job (connected through a foreign key).
Can be executed on a custom date. The purpose of this table is to hold
jobs that have separate use cases from a daily and monthly job.
I also made Inserter.py:
Given a connection to master_schedule.sqlite, inserts rows for the
Person, Job, CustomJob, and Bio tables. The rest of the tables will
be implemented upon further discussion.
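Condensed, the schema and Inserter flow look like this; only Job and DailyJob are shown, with column names of my own choosing since the exact columns aren't in the log:

```python
import sqlite3

# Stand-in for the real master_schedule.sqlite schema: Job plus one
# job table linked back to it through a foreign key.
SCHEMA = """
CREATE TABLE Job (
    id INTEGER PRIMARY KEY,
    text TEXT,
    extra_info TEXT          -- optional site-specific data
);
CREATE TABLE DailyJob (
    id INTEGER PRIMARY KEY,
    job_id INTEGER,
    interval_minutes INTEGER,
    frequency_limit INTEGER, -- stop once this many runs happen today
    FOREIGN KEY (job_id) REFERENCES Job (id)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)

# Inserter-style usage: create a Job row, then schedule it daily.
conn.execute("INSERT INTO Job (text, extra_info) VALUES (?, ?)", ("hello", None))
job_id = conn.execute("SELECT id FROM Job").fetchone()[0]
conn.execute("INSERT INTO DailyJob (job_id, interval_minutes, frequency_limit) "
             "VALUES (?, ?, ?)", (job_id, 60, 3))
```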
pylintrc:
This enforces style and other metrics
Notable features:
max of 5 arguments per function
max of 10 complexity per function
Run:
"pylint file.py --rcfile=pylintrc" in /Postr directory
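Those two limits are expressed in a pylintrc roughly as follows; the cyclomatic-complexity check comes from pylint's optional mccabe extension, and the exact section names in the team's file may differ:

```ini
[MASTER]
load-plugins=pylint.extensions.mccabe

[DESIGN]
max-args=5
max-complexity=10
```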
.vimrc:
This sets up a dev environment in a user's home directory
This file is not tracked, but a copy can be found here:
Note: this link may expire after some time
https://pastebin.com/raw/xA4BnP99
Notable features:
Python autocompletion
Syntastic plugin:
Checks for a pylintrc file, and automatically runs it when file is saved.
Run:
Automatically loaded when put into the HOME directory
flake8:
Notable features:
Calls pyflakes for syntax checking
Allows PEP8 analysis of code
Run:
"flake8 file.py"
mypy:
Notable features:
Allows for static type checking with the typing module
Enforces very strict type checking as defined in mypy.ini
Run:
"mypy file.py"
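The sort of annotation this checking enforces looks like the following (a generic example, not project code):

```python
from typing import List, Optional

def find_handle(handles: List[str], name: str) -> Optional[str]:
    """Return the stored handle equal to name, or None if absent.
    Strict mypy settings force callers to handle the None case."""
    for handle in handles:
        if handle == name:
            return handle
    return None
```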