This parametric equation is called a rose. I believe that this example is a Starr rose, but I had to derive the equation (and integrate it into the third dimension) myself, as little beyond a picture exists on the Internet. Created in Maple.
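For context, the classic polar rose has the form r = a·cos(kθ); the Starr rose is a more elaborate member of the same family, and its equation is not reproduced here. Below is a minimal sketch of the basic form only, assuming the simple cos(kθ) variant rather than the Starr equation:

```python
import math

def rose_points(k=4, a=1.0, samples=1000):
    """Sample the classic polar rose r = a*cos(k*theta) as (x, y) points.
    (The Starr rose shown in the image is a fancier member of this family;
    its exact equation is not reproduced here.)"""
    pts = []
    for i in range(samples):
        theta = 2 * math.pi * i / samples
        r = a * math.cos(k * theta)
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts
```

Plotting these points (e.g., with matplotlib) yields k petals for odd k and 2k petals for even k.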
This page chronicles my academic achievements. Things such as scholarships, acceptance into honor societies, and my research are shown here. The items in this category need not be related to a formal institution. In particular, my research, while benefiting from resources such as the university library, is generally detached from my coursework. At best, the topics I've learned in a class will be a small subset of what I need for my research, but it is not unusual for the two to be completely disjoint.
I defended my dissertation, entitled "Mining Complex High-Order Datasets", on April 23, 2010, and graduated with my Ph.D. in Computer and Information Sciences (GPA: 3.92) from Temple University on May 13, 2010.
My dissertation work is based primarily on higher-order analogues of singular value decomposition (high-order SVD, PARAFAC, and Tucker), coupled with a variety of wavelet-based techniques, and involved the creation of a comprehensive data mining framework for classification, co-clustering, concept discovery (e.g., detecting handedness in subjects based on fMRI scans), summarization, and compression of high-order spatiotemporal datasets, with principal applications in fMRI. In the course of this work, I also devised a tensor-theoretic multidimensional discrete wavelet transform, extended the WaveCluster algorithm to accept real-valued high-order image data, created a novel clustering algorithm based on WaveCluster called TWaveCluster, mitigated the massive partial volume effect inherent in grid-based segmentation (by deforming the grid in a context-sensitive manner prior to clustering), released the first public implementation of WaveCluster (which I have open-sourced under the GPLv3), and discovered quite a few interesting new results about motor and cognitive task activation patterns in the human brain.
For instance, I discovered that performing difficult cognitive tasks lights up the same areas of the brain that process physical pain, added evidence supporting the hypothesis that ipsilateral activation is important in motor-task-based learning (most activation is contralateral), and localized the spatiotemporal patterns of activation found in working memory tasks.
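The decompositions named above all start from the same primitive: the mode-n unfolding (matricization) of the data tensor, whose per-mode SVDs yield the factor matrices of high-order SVD. A minimal sketch of that unfolding step, using plain nested lists rather than a tensor library (this is an illustrative example, not the dissertation code):

```python
from itertools import product

def unfold(flat, shape, mode):
    """Mode-n unfolding (matricization) of a dense tensor stored as a flat
    list in row-major (C) order. Returns shape[mode] rows, each of length
    prod(other dims), with earlier non-mode indices varying fastest.
    High-order SVD computes an SVD of each such unfolding to obtain the
    factor matrix for that mode."""
    n = len(shape)
    # row-major strides into the flat storage
    strides = [1] * n
    for k in range(n - 2, -1, -1):
        strides[k] = strides[k + 1] * shape[k + 1]
    other = [k for k in range(n) if k != mode]
    cols = 1
    for k in other:
        cols *= shape[k]
    out = [[0] * cols for _ in range(shape[mode])]
    for idx in product(*(range(s) for s in shape)):
        # column index: first non-mode index varies fastest
        col, mult = 0, 1
        for k in other:
            col += idx[k] * mult
            mult *= shape[k]
        out[idx[mode]][col] = flat[sum(i * s for i, s in zip(idx, strides))]
    return out
```

For a 2x2x2 tensor with entries 0..7, the mode-0 unfolding is a 2x4 matrix whose rows are the two frontal "slabs" of the tensor.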
My next task is to fuse functional and structural data in a tensor model and to begin reading people's minds using the high-tech equivalent of phrenology :)
Though not in my dissertation, I also created texture-based classifiers for diagnosing breast and brain cancer and created a galactographic ductal topology classifier for many types of breast pathology.
Despite this, my goal is to move away from diagnosis and into developing treatments. I have far more medical knowledge than my formal background suggests, but the medical industry is a difficult one to convince of any new innovation.
Date: May 13, 2010.
I received a Master's Degree in Computer and Information Sciences from Temple University (as a continuing Ph.D. student) on August 31, 2007, one year and three days after enrolling. My GPA was 3.89. My master's project is the Medical Image Data Mining System, a content-based image retrieval system for MRIs and other medical scans.
MIDMS was published in ISBI 2008, in the article "A Web-Accessible Framework for the Automated Storage and Texture Analysis of Biomedical Images".
Date: August 31, 2007.
On May 17, 2006, I graduated Summa Cum Laude from Monmouth University (with the highest GPA in my class - see Academic Achievement Award entry).
My degree is a Bachelor of Science in Computer Science with a Mathematics minor.
Date: May 17, 2006.
This award is given annually at Commencement to the student with the highest GPA in the graduating class.
In the Class of 2006 (about 1,050 students), that would be me.
In addition to the plaque and the honor of receiving an extremely prestigious award in front of thousands of people, I received a $5,000 monetary award for this accomplishment.
Date: May 17, 2006.
This is a paper that I began writing on August 6, 2007, and completed on August 7, 2007. It deals with classification of tissue type in individually extracted regions of interest from MRI image datasets. We achieve 89% classification accuracy between 11 tissue types using our texture-based approach in a spontaneous murine model.
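The paper's exact feature set isn't described here; as one illustration of what a texture-based approach typically computes, below is a sketch of a classic gray-level co-occurrence (Haralick) texture feature. This is a generic example, not necessarily the descriptor used in the study:

```python
def cooccurrence(image, dx=1, dy=0, levels=4):
    """Gray-level co-occurrence matrix for pixel offset (dx, dy) -- the
    basis of the classic Haralick texture descriptors.
    image: 2-D list of ints in [0, levels)."""
    glcm = [[0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                glcm[image[y][x]][image[ny][nx]] += 1
    return glcm

def contrast(glcm):
    """Haralick contrast: sum over (i, j) of (i - j)^2 * p(i, j)."""
    total = sum(sum(row) for row in glcm)
    return sum((i - j) ** 2 * v / total
               for i, row in enumerate(glcm) for j, v in enumerate(row))
```

A uniform patch has contrast 0, while a fine checkerboard maximizes it; stacking several such statistics per region of interest gives a feature vector a standard classifier can consume.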
Date: August 7, 2007.
Project Polymath is my ambition to start a new type of university that encourages student-driven, simultaneous, and modular learning in multiple fields, as well as instruction in techniques to creatively generate ideas and fuse principles from across disciplines. The idea is to spark a new Renaissance by training an entire generation of polymaths rather than relying upon the rare autodidactic individuals, as was done in the past.
I had thought over the idea for a few years beforehand, but first acted on it on December 21, 2007 by creating the website (it was completed on December 24, 2007). In it, I lay down the principles of my vision, philosophy, and plan for instituting the university.
I have also started a 501(c)(3) nonprofit organization called the Polymath Foundation dedicated to funding and managing the university. We were incorporated on March 20, 2008 and received 501(c)(3) status on September 11, 2008, after which we rapidly began drawing up a curriculum.
As of April 2009, we have a board of 5 trustees and 3 executive committee members, we have planned 5 courses and taught 2, and we have nearly 100 students, 500 supporters, and 50 prospective instructors. Not bad for the organization's one-year anniversary, right?
This is an ambitious project; one of my life's goals. Fortunately, I am not alone in this - we get multiple emails daily from all sorts of people (including faculty and administration at other universities) praising the concept, and have quite the dedicated network of supporters.
It's also a good example of how my web design has been evolving.
Of all of my efforts, this is likely the one to watch most closely.
Date: December 24, 2007.
I noticed an adjunct professor opening in Monmouth's Computer Science Department while browsing the university's news and decided that it was a good opportunity to try out teaching. I thus began teaching CS305 and CS503, the undergraduate and graduate courses in Data Structures, Algorithms, and Java programming, from Fall 2008 to Spring 2009.
This is when I first realized just how much I loved teaching.
Date: September 2, 2008.
The Polymath Lectures are a series of courses (mostly online) that we are offering at Project Polymath to generate interest and to raise funds. Over 50 prospective instructors have applied during the first month of the program.
I'm not merely organizing this and sitting idly by, however; I'm also acting as the instructor of the first course, "How To Create Your Own Website". This was one reason I sought a teaching position last year: to gain the experience to act as an instructor here, with an innovative educational paradigm at stake.
Between an imminent dissertation defense, my normal teaching load at Monmouth, and the effort of teaching this course, my ability to multitask was pushed to its limit. I'm well past the point of adding significantly to my own burden, but it's a burden borne in the name of something right, and that makes all the difference.
Date: April 4, 2009.
This is a talk on educational pedagogy given for CS4HS events around the country. The target audience is a group of high school teachers with no CS background. Despite that assumption, I teach them both pigeonhole sort and maximum variance unfolding in order to demonstrate the power of intuition in learning (and it worked!). I use this to motivate a new model of technology education based on a minimal amount of highly intuitive presentation coupled with extensive hands-on learning.
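Pigeonhole sort works well for that audience because it matches physical intuition: drop each value into a labeled hole, then read the holes back in order. A minimal sketch (my illustrative version here, not the talk's materials) for integer keys:

```python
def pigeonhole_sort(values):
    """Sort a list of integers by counting each value into its 'hole',
    then reading the holes back in order."""
    lo, hi = min(values), max(values)
    holes = [0] * (hi - lo + 1)   # one hole per possible key
    for v in values:
        holes[v - lo] += 1
    out = []
    for i, count in enumerate(holes):
        out.extend([i + lo] * count)
    return out
```

This runs in O(n + range of keys) time, which is exactly why sorting feels "free" when the key range is small.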
It was received extremely well at Kean University, Duke University, and online for GoogleEDU, and more speaking engagements are on the way!
Date: June 27, 2012.
These are materials from the 2005 Science, Technology, and Engineering research conference that I attended. At this conference, I presented the Quantile Tree structure, which this site is named for, and an implementation of a knowledge-based programming platform.
This task was a critical test of my time-management and research abilities: I found out about the conference on April 1, three days before abstracts were due and less than one month before the actual conference. Furthermore, after I decided to present the quantile tree, still barely conceptualized at the time, Dr. Scherl asked me to present my knowledge-based programming platform as well. Faced with the dilemma of choosing between my own important yet esoteric research and an interesting but less personally meaningful demonstration, I chose to present both (setting in motion a motif that would follow me for my entire third decade). I completed a preliminary version of both abstracts by the deadline and spent every minute of free time over the next month developing the presentations.
My knowledge-based programming research was later published by AAAI Press.
Date: April 27, 2005.
This is actually very much like a political office: We each become representatives for students in our respective departments and meet routinely to discuss what can be improved at the university. The council also discusses the long-term strategic goals of the school. The student and industrial advisory councils, the latter being made up of very influential corporate and governmental executives, also meet annually to discuss matters of finance and what the industry expects of graduates. You may have heard about the "Centers of Excellence" being developed in certain sciences on Monmouth University's radio advertisements. That idea was formulated during a recent advisory council meeting.
Membership on the advisory council requires recommendation by a faculty member. Members must have a clear view of the "big picture" while at the same time noting details required to implement the council's goals. Key skills are consensus building, knowledge in and outside of one's major, and a responsible attitude towards those that you represent. The council routinely meets with some extraordinarily influential people, including corporate and university presidents and CEOs. Additionally, the council is influential in the decision-making process in its own right. As such, the utmost care and responsibility is required of its members.
For a brief summary of the purpose of the advisory council or the roster of either the student or industrial advisory council (as of 2005), click on the links below.
Date: February 11, 2005.
In the autumn of 2005, I decided to explore the realm of pure mathematical research. Six months later, I presented my findings at the 5th Annual STE Conference in a presentation titled "Properties and Applications of the Divisor Function".
As with the Quantile Tree, this research is independent. In a sense, this research is a test of my own intrinsic limits: how far I am able to go without any outside help whatsoever in a field that I have no previous experience with. After all, if I could arrive at a significant result with no training, how much more powerful will my insights be with years of training and experience? I am very satisfied with what I have discovered.
Some of these results have doubtless been proven by other mathematicians, but most of them are, as far as I can tell, new. All claims are proven mathematically, of course.
Proof of the special case k=1 is shown in the image; proof of the general case is contained in the presentation and is similar.
On November 7, 2008, I generalized my result further.
The derivation (amidst the rest of the esoterica I was thinking about that day) is given in a somewhat messy Maple worksheet.
My next step is to extend this recurrence one more time to composite numbers. After that, I am going to attack the complete recurrence using the recurrence analysis tools I have learned for the analysis of algorithms, with the goal of establishing a tight bound on the divisor function. I have several reasons for wanting to do this.
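The recurrence itself isn't reproduced on this page, but the object of study, the divisor function d(n) (often written τ(n)), counts the divisors of n and satisfies the standard multiplicative identity: if n = p1^e1 * ... * pk^ek, then d(n) = (e1 + 1) * ... * (ek + 1). A small sketch checking that identity against direct counting (a generic illustration, not the results of the presentation):

```python
def d_naive(n):
    """Count divisors of n directly."""
    return sum(1 for k in range(1, n + 1) if n % k == 0)

def d_formula(n):
    """d(n) = prod of (e_i + 1) over the prime factorization n = prod p_i^e_i."""
    total = 1
    m = n
    p = 2
    while p * p <= m:
        if m % p == 0:
            e = 0
            while m % p == 0:
                m //= p
                e += 1
            total *= e + 1
        p += 1
    if m > 1:          # one remaining prime factor with exponent 1
        total *= 2
    return total
```

For example, 12 = 2^2 * 3 gives d(12) = (2+1)(1+1) = 6, matching its divisors 1, 2, 3, 4, 6, 12.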
Date: April 12, 2006.
On April 23, 2006, I received the Award for Excellence in Computer Science from Monmouth University. Michael Edwards, an exceptionally talented programmer and one of my closest friends, also won this award for the 2006 year.
This award is given by the department in recognition of exceptional achievement in computer science. It is given on the basis of leadership, scholarship, and service pertaining to the field of computer science.
Date: April 23, 2006.
I was inducted into the highly prestigious Omicron Delta Kappa honor society on May 2, 2006.
This honor society requires an application demonstrating leadership in five areas: Academics, Community Service, Athletics, Speech and Journalism, and the Creative and Performing Arts.
Date: May 2, 2006.
This is a paper I began as an assignment for an artificial intelligence class. I am continuing work on it since I am interested in the problem of improving the AI in my program. I may publish it at some point, but whether I do or not is largely irrelevant.
This paper is still a work-in-progress. You may view the current revision of the paper.
Date: January 27, 2007.
The Medical Image Data Mining System is a framework for automated medical (primarily brain) image dataset submission and analysis which I wrote as a Master's Project (with the help of Jingjing Zhang, another graduate student) over the course of just one week. The system is written in Perl and Matlab and employs a methodology that achieved 89% classification accuracy on MRI images of the brain in another study (which we submitted in a separate paper to the same conference). Thus the system has diagnostic accuracy sufficient for clinical use in applications such as cancer diagnosis.
The associated paper was presented at ISBI 2008 and published in proceedings.
Date: August 1, 2007.
This is a journal paper published in IEEE Transactions on Medical Imaging, Vol. 28, Issue 4, pp. 487-493. It significantly extends our branching pattern analysis approach and is conducted on a much larger dataset than our previous studies in this area.
Date: November 4, 2007.
An extension of the Project Polymath concept into grade-school education was a long time coming, but this is an ideal time to move in this direction. Educational reform is coming to this country soon, and a program that seeks to instill creativity, innovation, and a love of learning will do very well when it does.
More than that, this is a stepping stone to the university - for its establishment and for its future students.
Date: October 23, 2009.
I began working as a Data Scientist at dstillery (formerly Media6Degrees) in December 2009. However, the title belies the fact that I do just as much software development there as I do data science, particularly using Java, Perl, Maven, Spring, and Hadoop - I was responsible for the production computation of our machine learning features as well as developing some new ones (using a wide array of statistical tools, such as SVD and KL divergence).
Date: December 7, 2009.
This is the first public implementation of the WaveCluster algorithm proposed by Sheikholeslami, Chatterjee, and Zhang. I wrote the implementation in Matlab.
Despite going as far as asking the authors (who did not respond), I was unable to find any existing code for a clustering analysis central to my thesis. So I had to write the algorithm myself.
I have filled in the missing details from the original paper and added a few additional enhancements.
The code is released under the GPL v3.
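For reference, the algorithm's outline: quantize the data onto a grid, wavelet-transform the grid so that dense regions survive smoothing, threshold the transformed grid, and take connected components of significant cells as clusters. The released code is the Matlab implementation above; what follows is an independent toy Python sketch of the idea (a single Haar averaging level, 4-connectivity, and a hand-picked threshold are all simplifications of the real algorithm):

```python
from collections import deque

def wavecluster_2d(points, domain, grid=8, threshold=0.5):
    """Toy sketch of the WaveCluster idea on 2-D points:
    1. quantize points onto a grid of cell counts;
    2. apply one level of the 2-D Haar transform, keeping only the
       low-pass (average) subband, which halves each dimension;
    3. threshold the smoothed density to find significant cells;
    4. connect significant cells (4-connectivity) into clusters;
    5. label each point by the cluster of its cell (-1 = noise)."""
    (x0, x1), (y0, y1) = domain
    counts = [[0] * grid for _ in range(grid)]
    cells = []
    for x, y in points:
        i = min(int((x - x0) / (x1 - x0) * grid), grid - 1)
        j = min(int((y - y0) / (y1 - y0) * grid), grid - 1)
        counts[i][j] += 1
        cells.append((i, j))
    g = grid // 2
    smooth = [[(counts[2 * i][2 * j] + counts[2 * i + 1][2 * j]
                + counts[2 * i][2 * j + 1] + counts[2 * i + 1][2 * j + 1]) / 4.0
               for j in range(g)] for i in range(g)]
    label = [[-1] * g for _ in range(g)]
    next_id = 0
    for i in range(g):
        for j in range(g):
            if smooth[i][j] > threshold and label[i][j] == -1:
                label[i][j] = next_id
                queue = deque([(i, j)])
                while queue:                      # flood-fill one cluster
                    a, b = queue.popleft()
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        na, nb = a + da, b + db
                        if (0 <= na < g and 0 <= nb < g
                                and smooth[na][nb] > threshold
                                and label[na][nb] == -1):
                            label[na][nb] = next_id
                            queue.append((na, nb))
                next_id += 1
    return [label[i // 2][j // 2] for i, j in cells]

# Two well-separated toy blobs in [0, 8) x [0, 8):
pts = [(1.0, 1.0), (1.2, 1.4), (1.6, 1.1), (1.3, 1.8),
       (6.0, 6.2), (6.4, 6.6), (6.8, 6.1), (6.3, 6.9)]
labels = wavecluster_2d(pts, ((0, 8), (0, 8)))
```

On the toy data the two blobs receive two distinct cluster labels; the smoothing step is what lets the method ignore isolated stray points as noise.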
Date: April 21, 2010.
I built a functioning brain-computer interface using a Mindflex, then tied it into my home automation system and cooked up a machine learning algorithm which was able to get enough insight out of the noisy 1-channel sensor to turn a light on and off.
You know, standard stuff.
I then delivered a demo of the BCI to a group of 50 talented middle school students at Google's CAPE program in 2011. They were learning about microcontrollers, and I think this provided some good motivation for why this might be a useful skill :)
Date: August 2, 2011.
This thumbnail displays the website of the MU ACM Chapter, as I designed it at the beginning of my tenure as Vice President and Webmaster of the chapter (Oct. 2004-May 2005).
As this was the first year that the ACM chapter was "unfrozen" as a student organization, my initial tasks were to recruit new members and ensure that other university and national ACM requirements were met. As of April 2005, our chapter's official status has been changed from "Provisional" to "Active" with the completion of an annual report to the national ACM.
In May of 2005, I became the President of the chapter. My focus for the 2005-2006 academic year will be on recruiting new membership to replace the graduating class, improving the computer science department's pass rate by offering tutoring programs to undergraduate computer science students, organizing fundraisers for the chapter, and ensuring the solvency of the chapter after the current officers (who all belong to the senior class) graduate.
Date: October 8, 2004.
“Develop an appreciation for the beauty in Mathematics”.
That is the motto of the Kappa Mu Epsilon honor society, which honors students and faculty for exceptional achievement in mathematics. As of April 17, 2005, ten days before the presentation of the Quantile Tree (itself an application of mathematics, particularly statistics, to computer science), I am a member of this society.
Criteria for induction are at least three semesters as a "regularly enrolled student", a class rank within the top 35% of one's class, and completion of at least three mathematics courses, at least one of which is calculus, with an average of a B or better.
Date: April 17, 2005.
As a Member
On March 20, 2005, what the student handbook calls "The highest academic honor at Monmouth University" was conferred upon me: I was inducted into the Lambda Sigma Tau honor society, for outstanding academic achievement and service.
As an Officer
Immediately following my induction, I began my official tenure as an officer of this society, though my duties began beforehand. I was responsible for creating the induction brochure, keeping track of the 415 copies of the brochure that were made from my design, and distributing these to inductees and their guests at the induction ceremony.
Following the induction, I performed various tasks for the society, including coordinating community service events and future inductions.
I am also responsible for redesigning and maintaining the official Lambda Sigma Tau website, located at http://lst.monmouth.edu. This site, with the exception of some static content on the homepage, is entirely database driven. Using the backend that I've developed, members can log in to check and update their community service records, officers can modify site preferences and approve community service hours, and the general public can view news and events relating to Lambda Sigma Tau.
There are strict criteria for membership in this society: A GPA of at least 3.5 and at least 65 credits at the time of induction, continuous maintenance of a 3.5 GPA, and the performance of at least 5 hours of community service per semester.
The criteria reflect the motto of the society:
“Leadership, Service, Truth”.
Date: March 20, 2005.
Phi Eta Sigma is a national honor society for those showing academic excellence during their freshman years. The requirements for admission into this honor society are a GPA of at least 3.5 during one's first year and a rank in the top 20% of one's class. Unlike Lambda Sigma Tau, membership is lifelong, with no need to maintain the criteria for admission.
The mottos of the society are:
“Knowledge is Power”
“Lovers of Wisdom”
I became a member of this honor society on November 2, 2003.
Date: November 2, 2003.
I have received two scholarships (despite never applying for any) during my years at Monmouth University: a $6,000/year scholarship from the university itself, following my application to the university, and the Dr. Harold Jacobs Scholarship for excellence in science and engineering, during my third and fourth years.
Date: March 27, 2002.
I made the Dean's List every semester that I attended Monmouth University, graduating with a GPA of 3.96.
Following completion of my B.S. degree in Computer Science (Math minor) at Monmouth University, I enrolled in a Ph.D. program in Computer Science at Temple University. I was awarded a "Round 1 Temple University Fellowship", worth full tuition and a stipend of $20,000 per year for four years, and a "Dean's Scholarship", worth $3,000 per year (the Dean's Scholarship is also apparently an invitation to a society of elite scholars within Temple University that holds guest lectures and roundtables).
As with the scholarships I was awarded as an undergraduate, I did not actually apply for these fellowships; they were granted to me by the department and graduate school.
These fellowships were part of the reason that I enrolled in Temple over arguably better schools (Columbia being a rather shocking example), though the primary reasons were the versatility of the respective programs and the feedback I've received regarding the universities.
Date: April 15, 2006.
This is a paper I wrote (in a single day) on a musical classifier I wrote (also in a single day) that attains 75% classification accuracy between Bach and Beethoven solely from the harmony employed in both composers' pieces.
Though this is a reasonable degree of accuracy, there are some areas in which I can improve the accuracy. Moreover, I hope to create an algorithmic composer rather than a simple classifier (the acronym originally stood for Bayesian Algorithmic Composer and Harmonizer).
You may download the paper or the entire classifier.
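The paper describes the details; in the spirit of the "Bayesian" in the acronym, here is a sketch of a naive Bayes classifier over chord symbols. The chord vocabulary and training sequences below are invented for illustration and are not from the actual classifier or its data:

```python
import math
from collections import Counter, defaultdict

def train(pieces):
    """pieces: list of (chord_sequence, composer) pairs. Returns add-one
    smoothed per-composer log-likelihoods over the chord vocabulary.
    Class priors are omitted (classes assumed balanced)."""
    counts = defaultdict(Counter)
    for chords, composer in pieces:
        counts[composer].update(chords)
    vocab = {c for ctr in counts.values() for c in ctr}
    model = {}
    for composer, ctr in counts.items():
        total = sum(ctr.values()) + len(vocab)
        model[composer] = {c: math.log((ctr[c] + 1) / total) for c in vocab}
    return model, vocab

def classify(model, vocab, chords):
    """Pick the composer maximizing the summed log-likelihood of the chords."""
    scores = {composer: sum(ll[c] for c in chords if c in vocab)
              for composer, ll in model.items()}
    return max(scores, key=scores.get)

# Invented toy training data -- NOT the real corpus:
pieces = [(["I", "IV", "vii°6", "I"], "Bach"),
          (["I", "vii°6", "V", "I"], "Bach"),
          (["I", "V7", "V7", "I"], "Beethoven"),
          (["I", "V7", "vi", "I"], "Beethoven")]
model, vocab = train(pieces)
```

Treating each chord independently is of course a crude model of harmony; richer features (chord transitions, voice leading) are the natural next step toward the algorithmic composer mentioned above.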
Date: May 6, 2007.
During my first semester at Temple, I submitted research from myself and my research group (DEnLab) to ISBI 2007. I am the first author on this paper because I performed much of the research, ran the experiments, and wrote the paper (the first paper I wrote from scratch).
The paper was rejected by ISBI, likely because our results were initially poor.
However, I revisited the paper over the summer of 2007 and obtained much better results. We submitted the revised paper to IWDM 2008, where it was accepted for publication.
Date: December 15, 2007.
On October 23, 2007, I was invited to join the Golden Key Honor Society for being among the top 15% of students at Temple University.
This makes five.
Date: October 23, 2007.
This paper introduces a high-order data summarization technique I developed using tensor decompositions which is capable of identifying the handedness of subjects based solely on their fMRI scans. I developed the method as part of my dissertation work, and it eventually formed a major component of the dissertation and defense.
Date: April 10, 2010.
This paper, accepted to PAKDD 2010, was the first semi-complete record of the techniques which would eventually form my "TWave" framework. The key result in this paper was a classification technique which preserved the accuracy of wavelet-based classification by subject and task in an fMRI dataset, while executing two orders of magnitude faster (2 hours vs. 8 days!).
Date: June 21, 2010.
This presentation was made by Professor Richard Scherl, my knowledge-based research advisor. While most of the work in this presentation is his, there is a good deal that has resulted from my own work as well. An acknowledgement of this, for which I am grateful, is on the last page.
Date: October 20, 2004.
I began working as a tutor for the Mathematics Learning Center at the university at the beginning of the Fall 2004 semester, after being recommended by my calculus professor. I am the only tutor who is not a math major, much to the surprise of the students (and myself).
I have the unique talent of being able to tutor people in subjects that I've never learned before, picking them up "on the fly" from the student and the problem. All I need are definitions of terms being used.
Date: September 7, 2004.
I began tutoring for the Peer Tutoring Center in the fall of 2004, at about the same time that I began peer tutoring for the Math Learning Center. I tutor students in Mathematics, Computer Science, and History here on a per-diem basis.
Date: September 1, 2004.
My group presented its research at the 2006 Society for Neuroscience Meeting.
Our presentation encompassed six research topics: the application of our novel classifier, Dynamic Recursive Partitioning (DRP), to an Alzheimer's disease fMRI dataset and to a study of sexual dimorphism in the human corpus callosum (two projects); an image linearization procedure using space-filling curves; a feature extraction technique using concentric spheres; the use of that technique to segment and classify brain tumors (with a very high degree of accuracy; this really should be employed clinically to assist diagnosis); and an image similarity search utility using wavelets and vector quantization.
I was responsible for writing the actual demonstration and GUI code. Additionally, I was responsible for the two DRP projects and, to a lesser extent, the similarity search project.
Working on these projects is how I learned Matlab. My knowledge probably still has some holes, but I learned the language incredibly rapidly - I was able to program proficiently in it after about one week. The only languages I learned faster than this were HTML (but that hardly counts) and Perl.
Date: October 16, 2006.
This paper was accepted to CARS 2008 in Barcelona, Spain, and will be presented in a poster session on June 25, 2008.
It is on using wavelets to exploit the spatiotemporal locality of 4D fMRI motor task data. The absolute accuracy of the classifier is reasonable at 81%, but it is a 30% improvement over voxelwise approaches. This was also my first exposure to wavelets, and as such, I chose the Haar wavelet for the sake of simplicity. I could likely improve the accuracy even more with a different wavelet function.
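For readers unfamiliar with it, one level of the Haar transform simply replaces a signal with pairwise averages (a smoothed approximation) and pairwise differences (detail); applied along each axis of 4D data, it concentrates smooth spatiotemporal structure into a few coefficients. A one-dimensional sketch (an illustration of the transform itself, not the paper's pipeline):

```python
def haar_step(signal):
    """One level of an (unnormalized) Haar wavelet transform: pairwise
    averages (approximation) and differences (detail). len(signal) must
    be even."""
    avg = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    det = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    return avg, det

def haar_inverse(avg, det):
    """Exact reconstruction: s[2i] = avg[i] + det[i], s[2i+1] = avg[i] - det[i]."""
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out
```

Recursing on the averages gives the full multilevel decomposition; smoother wavelets (e.g., Daubechies families) trade the Haar transform's simplicity for better frequency localization, which is the improvement suggested above.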
Because this was my first exposure to wavelets, I really hope I don't have any mistakes in the background section of this paper. The reviewers did not identify any, but I've found peer review to be next to useless in performing its ostensible function, instead choosing to enforce petty guidelines of presentation.
Date: February 14, 2008.
This is a conference paper published at ISBI 2009 which augments our tree-like structure analysis by automatically discovering branching points. The first author on this paper is Haibin Ling; I am the second. This is fitting; my role in this paper was mostly supportive, although I did provide some help with the experimental design.
Date: January 15, 2009.
Another paper on galactogram classification using topology, accepted to ISBI 2009. I provided "the competing method" in this paper. Angeliki Skoura is the first author; I am second.
Date: January 15, 2009.
Spatial Feature Extraction Techniques for the Analysis of Ductal Tree Structures: By Aggeliki Skoura, Michael Barnathan, and Vasileios Megalooikonomou. Published in Proceedings of EMBC 2009, Minneapolis, Minnesota, September 2 - 6, 2009.
Date: September 2, 2009.
A followup to the prior paper on branching node detection, with an improved technique.
Probabilistic Branching Node Detection Using AdaBoost and Hybrid Local Features: By Tatyana Nuzhnaya et al. (I am the second author). Published in Proceedings of ISBI 2010.
Date: April 10, 2010.