
Showing posts from March, 2020

Journal

On Tuesday, March 24th, we had a meeting with Professor Hassibi over Zoom. Prior to our talk, we had been having issues with our code because it didn't produce the results we expected. He told us to find and graph the eigenvalues of the graph Laplacian, as they would indicate the number of clusters found and give us a reference value to compare other data points against. He noted that graphing the corresponding eigenvector would indicate whether a cluster had been isolated or not. From there, it was up to us to tinker with the values for a bit and try different algorithms like K-means. Below is a video of the meeting with Professor Hassibi.

This week has been very interesting, to say the least. As you may know, COVID-19 has just taken the U.S. for a turn and has put a pause on daily life, forcing businesses to adapt, overcome, or, in some cases, fail. For us high schoolers, classes have migrated to the Zoom platform, and with them came teachers with bad internet service, leading to indecipherable dialogue.
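For reference, here is a minimal sketch in MATLAB of the eigenvalue check Professor Hassibi described. It assumes a symmetric adjacency matrix A is already in the workspace and uses the unnormalized Laplacian; the variable names and the choice of k are placeholders, not values from our dataset.

    % Minimal sketch of the Laplacian eigenvalue check (assumes a symmetric
    % 0/1 adjacency matrix A is already loaded in the workspace).
    D = diag(sum(A, 2));             % degree matrix
    L = D - A;                       % unnormalized graph Laplacian

    [V, E] = eig(L);                 % eigenvectors and eigenvalues of L
    [lambda, order] = sort(diag(E)); % eigenvalues in ascending order
    V = V(:, order);                 % reorder eigenvectors to match

    plot(lambda, 'o-');              % number of near-zero eigenvalues ~ number of clusters
    xlabel('Index'); ylabel('Eigenvalue');
    title('Graph Laplacian spectrum');

    figure;
    stem(V(:, 2));                   % a single eigenvector shows which vertices a cluster isolates
    xlabel('Vertex'); ylabel('Eigenvector entry');

    k = 3;                           % placeholder number of clusters
    idx = kmeans(V(:, 1:k), k);      % K-means on the leading eigenvectors gives cluster labels

Counting the near-zero eigenvalues hints at how many clusters to look for, and running K-means on the leading eigenvectors is one common way to turn the spectrum into actual cluster assignments.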

Technical Journal - Spectral Clustering in MATLAB

After our meeting with Ethan, Dr. Hassibi's graduate student, we were tasked with implementing spectral clustering on the dataset of adjacency matrices. To determine the algorithm's accuracy, we have to use false positives and false negatives. As mentioned, the first step was to clean the data and run spectral clustering on it. We soon realized that the spectral clustering function only works on MATLAB R2019b; once we downloaded R2019b, it recognized spectralcluster as a function. Over the weekend, I did a bit of self-study and tried to teach myself MATLAB syntax, which I found similar to Python 3.8. Below is a video that I found helpful.

Image: The error displayed in the console when we tried to use spectralcluster on R2019a

The first step is to load in the dataset, which is stored as a struct, a block of memory that groups related variables. Assign observed and rawAdj to CAdj and Adj respectively, then entrywise multiply the two to get Adj1. We use the eye function…
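To make those preprocessing steps concrete, here is a minimal sketch of how they might look in MATLAB R2019b. The field names observed and rawAdj come from the post itself, but the file name dataset.mat, the number of clusters k, and the way eye is used to finish the cut-off step are assumptions on my part.

    % Minimal sketch of the preprocessing described above (requires R2019b or
    % later plus the Statistics and Machine Learning Toolbox for spectralcluster).
    data = load('dataset.mat');     % placeholder file name; load returns the saved struct
    CAdj = data.observed;           % observed (crowdsourced) adjacency matrix
    Adj  = data.rawAdj;             % raw adjacency matrix
    Adj1 = CAdj .* Adj;             % entrywise (Hadamard) product of the two

    n = size(Adj1, 1);
    S = Adj1 + eye(n);              % eye(n) is the n-by-n identity; adding it just puts
                                    % 1s on the diagonal so each node is similar to itself

    k   = 3;                        % placeholder number of clusters
    idx = spectralcluster(S, k, 'Distance', 'precomputed');  % treat S as a similarity matrix

Passing 'Distance','precomputed' tells spectralcluster to treat its first input as a similarity matrix rather than as raw data points, which is what we want when we already have an adjacency matrix.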

Journal

Last Thursday, we met with Ethan Abbasi, Dr. Hassibi's soon-to-be Ph.D. graduate, to discuss our project on semi-crowdsourced clustering. He needed little time to understand our project and dataset, which meant we could get right into possible algorithms. The first one mentioned was "Spectral Clustering," an algorithm we researched extensively last semester, and Ethan gave us advice such as adding matrices together to form cliques. He also had us think about performance metrics (a way to measure the success of an algorithm); two metrics we thought of were mathematical representations of false positives and false negatives. In a last attempt to brainstorm algorithms, we brought up min-cut. However, as Ethan demonstrated, without an understanding of convex optimization, min-cut would be very inefficient.

Image: A photo of topics we discussed during our meeting with Ethan

This weekend during robotics, the programming team, including myself, successfully implemented…
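Since the post doesn't show how the false-positive and false-negative metrics were ultimately written, here is one possible sketch in MATLAB: it counts, over all pairs of points, how many pairs the clustering wrongly puts together (false positives) and how many it wrongly splits apart (false negatives). The function name and the label-vector inputs are hypothetical.

    % One possible pairwise false-positive / false-negative count for a clustering.
    % trueLabels and predLabels are assumed to be n-by-1 vectors of cluster labels.
    function [fp, fn] = pairwiseErrors(trueLabels, predLabels)
        n  = numel(trueLabels);
        fp = 0;      % pairs the clustering puts together that should be apart
        fn = 0;      % pairs the clustering splits apart that should be together
        for i = 1:n-1
            for j = i+1:n
                samePred = predLabels(i) == predLabels(j);
                sameTrue = trueLabels(i) == trueLabels(j);
                if samePred && ~sameTrue
                    fp = fp + 1;
                elseif ~samePred && sameTrue
                    fn = fn + 1;
                end
            end
        end
    end

It could then be called as [fp, fn] = pairwiseErrors(trueLabels, idx) on the labels returned by spectralcluster, assuming ground-truth labels are available for comparison.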