My first experience with programming was at university, with the statistical programming language R. I used R as part of my undergraduate lab work to perform statistical tests and create graphics and data plots.
During my dissertation lab work, I began using Python. I used Python to map the protein-protein interactome of Drosophila melanogaster onto its Anopheles gambiae orthologs, producing a novel predicted interactome. This was used to identify Anopheles gambiae genes of interest in Plasmodium falciparum resistance, and one gene of particular significance in resistance to Plasmodium infection was identified as a result.
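The core of this kind of ortholog projection can be sketched in a few lines. The gene names and the ortholog table below are entirely made up for illustration; the real work used curated interaction and orthology datasets.

```python
# Hypothetical ortholog table: Drosophila gene -> Anopheles ortholog.
orthologs = {"dme_A": "aga_A", "dme_B": "aga_B", "dme_C": "aga_C"}

# Hypothetical Drosophila protein-protein interactions.
dme_interactions = [("dme_A", "dme_B"), ("dme_B", "dme_C"), ("dme_A", "dme_X")]

# Keep only interactions where both partners have an Anopheles ortholog,
# translating each pair - the essence of an ortholog-projected interactome.
aga_interactions = [
    (orthologs[a], orthologs[b])
    for a, b in dme_interactions
    if a in orthologs and b in orthologs
]
```

The pair involving `dme_X` is dropped because it has no ortholog, which is why the projected interactome is "reduced" relative to the Drosophila one.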
In addition, throughout my undergraduate degree I taught myself, and used, the typesetting language LaTeX.
A reduced protein-protein interactome of Anopheles gambiae generated from orthologs of Drosophila melanogaster proteins
After my time at Imperial, I spent two years as a Financial Consultant at Fideres. My role frequently involved the analysis of large datasets (such as financial instrument tick data), particularly to uncover anomalies indicative of foul play. This required heavy use of the pandas data analysis library for Python, together with matplotlib to produce graphics ready for inclusion in US court complaints.
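A first-pass screen for anomalous price movements in tick data can be sketched with pandas like this. The data and the 5% threshold are invented for illustration; real screening used far richer data and tests.

```python
import numpy as np
import pandas as pd

# Hypothetical tick data: timestamped trade prices for one instrument.
ticks = pd.DataFrame({
    "timestamp": pd.date_range("2019-01-02 09:30", periods=8, freq="1min"),
    "price": [100.0, 100.1, 100.05, 112.0, 100.2, 100.15, 100.1, 100.3],
})

# Log returns between consecutive ticks.
ticks["ret"] = np.log(ticks["price"]).diff()

# Flag ticks whose return magnitude exceeds 5% - an arbitrary
# illustrative threshold for a first-pass anomaly screen.
anomalies = ticks[ticks["ret"].abs() > 0.05]
```

Here the spike to 112 and the snap back are both flagged; in practice a single outlier like this might indicate a bad print, a fat-finger trade, or something worth a closer look.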
My work at Fideres also involved the conception and testing of potential trading strategies. Because the strategies were so specific, this required building a bespoke backtesting and simulation library in Python.
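The skeleton of a backtest is simple even if the real library was not. Below is a minimal sketch using a moving-average signal as a stand-in strategy (not one actually traded or tested at Fideres), with invented prices, no costs, and no position sizing.

```python
def backtest_sma(prices, window=3):
    """Hold the asset whenever the latest close is above its simple
    moving average; otherwise stay flat. Returns final equity from 1.0.

    Purely illustrative: ignores costs, slippage, and sizing."""
    equity = 1.0
    for i in range(window, len(prices)):
        # SMA of the `window` closes up to and including yesterday,
        # so the signal uses no information from day i (no look-ahead).
        sma = sum(prices[i - window:i]) / window
        if prices[i - 1] > sma:
            equity *= prices[i] / prices[i - 1]  # earn day i's return
    return equity

# Hypothetical daily closing prices.
equity = backtest_sma([100, 101, 103, 102, 105, 107, 106, 108, 110, 109])
```

The key design point any such library has to get right is avoiding look-ahead bias: the position for day `i` is decided using only prices up to day `i - 1`.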
I was also part of the inception of a new arm of the business called Fideres Analytics, which provided a web-based platform of tools to aid US law firms in their work and in the genesis of cases. Much of this programming was collaborative, using git.
Graphs produced as part of my work on Fideres Analytics: a stock price and volume chart highlighting days with significant price movements, and plots of the autocorrelation of stock returns.
A large part of my contribution to Fideres Analytics was a daily automated PDF report called the 'Securities Monitor'. Each day the report ran a statistical test for significant stock price drops and flagged them for the attention of US law firms. This work also included a stock news API I built myself: given a stock ticker, it returned news links, headlines, and dates, and was built using packages such as BeautifulSoup and requests. I also wrote a number of natural language processing tools for keyword analysis of the scraped news, using nltk alongside bespoke solutions.
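The keyword-analysis side of this can be sketched with nothing but the standard library. The headlines, the ticker, and the tiny stop-word list below are invented for illustration (the production tools drew on nltk, e.g. its stop-word corpus, rather than a hand-picked set).

```python
import re
from collections import Counter

# Hypothetical scraped headlines for a single (fictional) ticker.
headlines = [
    "Acme Corp shares plunge after accounting probe announced",
    "Regulators open probe into Acme Corp accounting practices",
    "Acme Corp CEO resigns amid accounting scandal",
]

# Tiny hand-picked stop-word list; the real tools used nltk's corpus.
STOPWORDS = {"after", "into", "amid", "the", "a", "an", "and", "of"}

def keyword_counts(texts):
    """Count lowercase word frequencies across texts, skipping stop words."""
    words = (w for t in texts for w in re.findall(r"[a-z]+", t.lower()))
    return Counter(w for w in words if w not in STOPWORDS)

counts = keyword_counts(headlines)
```

Even this crude frequency count surfaces "accounting" and "probe" as recurring keywords across the headlines, which is the kind of signal a law firm scanning for potential cases cares about.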
I now work with businesses and individuals who need websites and web apps. I use a mixture of frameworks and technologies depending on the needs of the client, including ExpressJS, Handlebars, React, and GatsbyJS. I also have experience with relational database management systems (RDBMS), namely PostgreSQL.
In addition, I provide images for businesses' web use through my photography company, Through Ben's Lens.
Check out some of my web projects below.
A list of the tools I use every day for programming: