Thursday, March 27, 2014

Week 11 - Sorting

Sorting is organizing a set in a certain order, such as from smallest to greatest or greatest to smallest. There are many different ways of sorting a set, such as Timsort, Merge Sort, Selection Sort and Insertion Sort. Timsort is the algorithm behind the built-in sort method found in Python itself. Merge Sort works by splitting the list into individual pieces and merging these pieces back together into a sorted list. Insertion Sort works by walking through the list one item at a time and inserting each item into its correct position within the already-sorted section on its left. Selection Sort works by splitting the original list into 2 sections: a sorted section on the left and an unsorted section on the right. It repeatedly moves the smallest item from the unsorted section into the sorted section, one at a time. Basically, the sorted section starts off empty and eventually grows to the entire length of the original list, at which point the original list has been sorted.
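
To make the Selection Sort description concrete, here is a minimal sketch of it in Python. This is my own illustration of the idea, not the version from lecture:

    def selection_sort(lst):
        """Sort lst in place by repeatedly moving the smallest
        remaining item to the end of the sorted section."""
        for i in range(len(lst)):
            # find the index of the smallest item in the unsorted section
            smallest = i
            for j in range(i + 1, len(lst)):
                if lst[j] < lst[smallest]:
                    smallest = j
            # swap it into place at the end of the sorted section
            lst[i], lst[smallest] = lst[smallest], lst[i]

    L = [4, 1, 3, 2]
    selection_sort(L)
    print(L)  # [1, 2, 3, 4]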

However, what differentiates these sorting algorithms, aside from how they sort, is how long it takes them to sort a set, i.e. their performance. Every sorting algorithm, whether Timsort, Selection Sort or any other kind of sort, requires a certain number of steps before a set is sorted. For a list of size n, the performance of an algorithm is categorized by how the number of steps grows with n: constant, where the number of steps does not change with the length of the list; n, where the number of steps grows linearly with the length; n^2 or n^3, where the number of steps grows with the square or cube of the length; log n and n log n, where the number of steps grows with the logarithm of the length (or the length times its logarithm); and finally 2^n or n^n, where the number of steps is 2 raised to the power of the length, or the length raised to the power of itself. These categories are written in Big-Oh notation, which is a way of representing the worst case, or upper bound, on the number of steps an algorithm needs to sort a list. The best type of algorithm you can have is categorized by O(c), which means the worst case for the algorithm is constant: no matter how long your list is, it will only take c steps. The number of steps does not vary with the length of the list, so these algorithms are the fastest and most efficient. The worst type of algorithm you want to have is categorized by O(n^n) or O(2^n). These are exponential functions and take a huge number of steps. For a list of just 5 items, an algorithm categorized by O(2^n) would already require 32 steps, versus only 5 steps for an algorithm categorized by O(n). Exponential algorithms are really, really slow and not efficient, so you want to avoid algorithms with O(2^n) or O(n^n).
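
To see just how quickly the exponential category blows up compared to the linear one, here is a tiny example of my own comparing n against 2^n for a few list lengths:

    for n in [5, 10, 20, 30]:
        print(n, 2 ** n)
    # 5 32
    # 10 1024
    # 20 1048576
    # 30 1073741824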

As much as you want to avoid algorithms with a Big-Oh of an exponential function, you also want to avoid writing your own sorting functions, or any other functions and methods that duplicate what is already built in. The reason is that, as a student, the function you write is not optimized for the platform and is not going to work as well as the built-in methods. For example, the built-in sort's worst case is O(n log n), whereas the other kinds of sort we saw, even the ones the prof wrote such as Selection Sort or Insertion Sort, are still O(n^2). This means the built-in methods are generally going to be faster than what you can come up with yourself. Therefore, you want to avoid having to write your own versions of these functions and methods; it is almost always better to use the built-in ones.
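
As a rough way to check this claim yourself, here is a sketch that times the built-in sorted against the selection_sort sketch from above on a random list (the sizes and repeat count are arbitrary choices of mine):

    import random
    import timeit

    data = [random.random() for _ in range(2000)]

    builtin_time = timeit.timeit(lambda: sorted(data), number=10)
    selection_time = timeit.timeit(lambda: selection_sort(list(data)), number=10)

    print('built-in sorted:', builtin_time)
    print('selection_sort:', selection_time)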

Wednesday, March 26, 2014

Week 10 - A2 Part 2

A2 Part 2 was a more straightforward assignment. It didn't involve a lot of guesswork as to what we could and could not do, and what we needed to do and not do, because the guidelines were more specific and less vague. For some of the functions, the way we solved them involved using previous exercises and assignments, such as using exceptions to indicate whether a regex is valid or not. However, everything in the assignment was something we had dealt with before, and it wasn't too complicated to solve. It just took time to solve and to figure out what each step required, and, for the recursive functions, all the base cases and what to do under certain conditions.
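
The exception idea looked roughly like the pattern below. This is only a sketch with made-up names and a toy validity rule (balanced parentheses), not the actual assignment code or its real rules:

    def check_valid(s):
        """Raise ValueError if s has unbalanced parentheses (a toy stand-in
        for the real regex validity rules)."""
        depth = 0
        for ch in s:
            if ch == '(':
                depth += 1
            elif ch == ')':
                depth -= 1
            if depth < 0:
                raise ValueError('unbalanced parentheses in ' + repr(s))
        if depth != 0:
            raise ValueError('unbalanced parentheses in ' + repr(s))

    def is_valid(s):
        """Return True iff check_valid does not raise an exception."""
        try:
            check_valid(s)
            return True
        except ValueError:
            return False

    print(is_valid('(1|0)'))  # True
    print(is_valid('(1|0'))   # False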

Monday, March 17, 2014

Week 9 - A2 Part 1

The first part of this assignment for me was really vague. I honestly had no idea what exactly to do. My group and I had to go through several revisions and complete overhauls of our code in order to create a correct Binary Tree class. We started off with very little inheritance and eventually realized that most of the code, depending on how we wrote it, could easily be inherited from one class to another because a lot of the code was very similar, thereby reducing the amount of duplicate code. That was the hard part of this assignment: reducing duplicate or similar code. By reducing the amount of code and inheriting more of it, you reduce the chance of bugs and reduce the amount of work you have to do. Reducing the amount of work you have to do is a crucial and critical part of studying comp sci. The less work you have to do to get the job done, the better.
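
The general shape of what we ended up with looked something like this. It is only a sketch of the inheritance idea with made-up class names, not our actual assignment code:

    class Tree:
        """A general tree node; shared behaviour lives here so the
        subclasses don't have to repeat it."""

        def __init__(self, value, children=None):
            self.value = value
            self.children = children if children is not None else []

        def __repr__(self):
            return '{}({}, {})'.format(type(self).__name__,
                                       repr(self.value), self.children)

        def __eq__(self, other):
            return (type(self) == type(other)
                    and self.value == other.value
                    and self.children == other.children)

    class Leaf(Tree):
        """A node with no children; only __init__ differs, while __repr__
        and __eq__ are inherited from Tree."""

        def __init__(self, value):
            Tree.__init__(self, value)

    class BinaryTree(Tree):
        """A node with exactly two children; again, the shared methods
        come from Tree."""

        def __init__(self, value, left, right):
            Tree.__init__(self, value, [left, right])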

Week 8 - Linked Lists

A linked list is essentially a tree that runs in a straight line, with no branching. It is a list, but each of the values in the list points to the next value. So if I had a linked list with the values 1, 2, 3, 4, then the linked list would have the value 1 pointing to value 2, which would point to value 3, and so forth. Removing an item from a linked list is also easy. Instead of shifting the whole list like in a normal list, you just change which values point to what. Let's say we wanted to remove 2 from the linked list 1, 2, 3, 4. We would just get value 1 to point to value 3, and that's all you have to do. Since nothing points to 2 anymore, it no longer exists in the linked list.
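
Here is a small sketch of that removal idea, with a minimal node class of my own (not the exact class from lecture):

    class LinkedListNode:
        """A node holding a value and a reference to the next node."""

        def __init__(self, value, nxt=None):
            self.value = value
            self.nxt = nxt

    # build the linked list 1 -> 2 -> 3 -> 4
    head = LinkedListNode(1, LinkedListNode(2, LinkedListNode(3, LinkedListNode(4))))

    # remove 2 by making node 1 point past it, straight to node 3
    head.nxt = head.nxt.nxt

    # walk the list to show 2 is gone: prints 1, 3, 4
    node = head
    while node is not None:
        print(node.value)
        node = node.nxt
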
This week's lab was really confusing; I did not really understand what my TA was asking me to do. Better luck next week.

Saturday, March 1, 2014

Week 7 - Recursion

Recursion is a method of solving problems by breaking down a big problem into several smaller instances of the same problem. The smallest of these instances are called base cases. A base case is the simplest version of the bigger problem that you are trying to solve using recursion. There are many examples of this when using recursion to solve problems that involve nested structures, such as binary search trees and nested lists. For example, say I wanted to find the max of [1, [2, 3]]. The simplest instance, a.k.a. the base case of the problem, is just applying the built-in function max to a single non-nested list. So the function would cycle through the original list [1, [2, 3]], but when it reaches the item [2, 3], it would call itself to find the max of that nested list. [2, 3] falls under the base case, and the max of that list, found using max(), is 3. The function would then replace [2, 3] with 3, so the list becomes just [1, 3], and the function would simply call max() on it and return 3.
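
Here is a quick sketch of that nested-max idea, written my own way rather than copied from lecture:

    def nested_max(lst):
        """Return the largest number in a possibly nested list of numbers."""
        flattened = []
        for item in lst:
            if isinstance(item, list):
                # recursive step: replace the nested list with its max
                flattened.append(nested_max(item))
            else:
                # base case: the item is just a number
                flattened.append(item)
        return max(flattened)

    print(nested_max([1, [2, 3]]))          # 3
    print(nested_max([5, [1, [7, 2]], 4]))  # 7
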
By breaking down a problem into its base cases, you can solve many versions of the same problem, as long as they all share the same base cases. This technique is really useful because it reduces the amount of code. Many big problems can be simplified to their base cases, which lets you be lazy and solve the big problem with relatively few lines of code. Being lazy is one of the greatest skills of a computer scientist. The code is also concise in that it doesn't require lines and lines of code; some problems can be solved recursively with a single line. Therefore, recursion is a really useful technique.
A prime example of using recursion was the problem at hand in A1, which was moving cheeses across the four stools. The stools were a nested list, where each stool was a list inside a bigger list holding all the stools. This was a problem that could be solved very easily using recursion because there were base cases that could be used to solve the bigger problem. The base cases were moving one cheese from the origin to the destination, and moving two cheeses from the origin to the destination. By breaking the overall big problem of moving several cheeses across 4 stools into the base cases of moving 1 or 2 cheeses, the solution to the problem was slowly built up. What I mean by the solution being built up is that because you know how to move 2 cheeses, you know how to move 3 cheeses, and because you know how to move 3 cheeses you know how to move 4 cheeses, which is just moving 3 cheeses and then moving the 4th cheese, i.e. one cheese, from the origin to the destination.
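
That build-up idea is easiest to see in the classic three-stool version, sketched below. This is the standard textbook recursion, not our actual four-stool A1 solution:

    def move_cheeses(n, origin, destination, spare):
        """Print the moves needed to move n cheeses from origin to
        destination, using spare as the extra stool."""
        if n == 1:
            # base case: a single cheese moves directly
            print('move a cheese from', origin, 'to', destination)
        else:
            # move the n-1 smaller cheeses out of the way, move the biggest
            # cheese, then move the n-1 smaller cheeses on top of it
            move_cheeses(n - 1, origin, spare, destination)
            move_cheeses(1, origin, destination, spare)
            move_cheeses(n - 1, spare, destination, origin)

    move_cheeses(3, 'stool 1', 'stool 3', 'stool 2')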

Friday, February 21, 2014

Week 6 - A1 Completed

It has been a busy week for me with many things to do and exams to study for. But the highlight of the week has to be completing A1.
A1 definitely made me understand recursion a lot better, especially working out a way to move the cheeses over the stools using the most efficient recursive algorithm we could find. What I found to be really helpful in solving recursion problems is to visually trace them out; seeing it happen revealed patterns and gave insight into figuring out the solution.
The concept of trees was fairly simple to understand. On the other hand, unit testing is still a bit difficult for me to code out. I still keep referring back to previous labs and looking up the API on how to use certain methods in the unittest module. I think I just need a bit more practice with it, especially considering it's on the first term test.
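
For my own reference, here is the minimal unittest skeleton I keep having to look up (a generic example of my own, not a lab's actual tests):

    import unittest

    def double(x):
        """A trivial function to test."""
        return 2 * x

    class TestDouble(unittest.TestCase):

        def test_positive(self):
            self.assertEqual(double(3), 6)

        def test_zero(self):
            self.assertEqual(double(0), 0)

    if __name__ == '__main__':
        unittest.main()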

Sunday, February 9, 2014

Week 5 - A1

A1 is finally near completion, with just the docstrings left, which personally are the most annoying things to write because it's so much writing. But that pales in comparison to the amount of work it took to finish everything else. One of the most annoying parts was getting the GUIController to work, because I had to spend several hours debugging it. I'm not sure exactly what I changed, but somehow after changing about 3 things, everything worked. I'm still trying to figure out what I changed to get it to work, because there have been so many instances where I literally change one thing and everything works. It feels good that everything works, especially the algorithm for A1.

The tour.py algorithm for A1 is really, really annoying because, when trying to find the algorithm, keeping track of which variables are which gets so confusing, especially once you add more and more cheeses. I have gotten lost in my tracking of the variables, forgetting which one represents which stool. I don't know how many papers I have scrunched up and thrown away because I got lost. However, after finally not getting lost, the feeling of accomplishment was amazing. It wasn't the best solution, but pretty darn close. The key to figuring it out for me was keeping track of source, dest, int1 and int2 meticulously. I drew everything out on paper and then bam, everything made sense. Never underestimate the power of paper and pen. It is your best friend when it comes to recursion where you have to keep track of variables that change. Happy tracking.
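
To show why the variable tracking gets confusing, here is a rough sketch of the shape of a four-stool move. Only the parameter names source, dest, int1 and int2 come from my notes; the split of i cheeses is a placeholder choice and this is not our actual tour.py:

    def move_three_stools(n, source, dest, spare):
        """Classic three-stool move, used when one stool is blocked."""
        if n == 1:
            print('move a cheese from', source, 'to', dest)
        else:
            move_three_stools(n - 1, source, spare, dest)
            print('move a cheese from', source, 'to', dest)
            move_three_stools(n - 1, spare, dest, source)

    def move_four_stools(n, source, dest, int1, int2):
        """Park some cheeses on int1, move the rest using three stools,
        then bring the parked cheeses over to dest."""
        if n == 1:
            print('move a cheese from', source, 'to', dest)
        else:
            i = n // 2  # placeholder split; the real choice affects efficiency
            move_four_stools(i, source, int1, int2, dest)
            move_three_stools(n - i, source, dest, int2)
            move_four_stools(i, int1, dest, int2, source)

    move_four_stools(4, 'stool 1', 'stool 4', 'stool 2', 'stool 3')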