Wednesday, April 2, 2014

Sorting & Efficiency

Hello again everyone. Lately we've been working with sorting algorithms and efficiency. Last term in CSC108 we spent some time on searching algorithms, which feel closely related to sorting to me: both are about getting through a list in as few steps as possible, and having a sorted list is exactly what makes fast searching possible in the first place.

We looked into some of the most common sorting algorithms, namely Bubble Sort, Selection Sort, Insertion Sort, Merge Sort and Quick Sort. Timsort, the algorithm behind Python's built-in sort method, was also mentioned. We compared these algorithms' efficiency with each other, using the big-O notation of their worst cases.

Big-O notation is a short way of stating how the number of steps an algorithm takes grows with the size of its input. It is written as O(n), O(n^3), O(log n), etc., where n is the size of the input; in our case (sorting) it is the length of the input list. We determine the big-O of an algorithm from the number of steps it takes on an input of size n. For example, a certain algorithm might take 2n^2 + 32n + 54 steps on an input of size n. This algorithm has the big-O notation O(n^2). As you can see, we disregard the coefficients and the constants and only keep the leading term, since everything other than the leading term becomes insignificant as n gets larger and larger.
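
To make this concrete, here is a little toy function of my own (not from lecture or the readings): it compares every pair of items in a list, so its exact step count is roughly n^2/2 comparisons plus some smaller terms, and its big-O is O(n^2).

 def count_inversions(lst):
   """Count how many pairs of items in lst are out of order."""
   count = 0
   for i in range(len(lst)):
     for j in range(i + 1, len(lst)):   # every pair gets compared once
       if lst[i] > lst[j]:              # one "step" per comparison
         count += 1
   return count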

I won't go through all of the sorting algorithms and derive the big-O of their worst cases, because that would make a looong SLOG post. I'll just give the answers (btw you can find a cheat sheet for these and more here).

Worst case runtime of certain sorting algorithms:

Bubble: O(n^2)
Selection: O(n^2)
Insertion: O(n^2)
Merge: O(n log(n))
Quick: O(n^2)

Yeah, I know they mostly look the same, but keep in mind that these are the worst case runtimes. Quick Sort is O(n log n) on average, and Merge Sort is O(n log n) even in its worst case, so in practice those two are much more efficient than the first three.
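
To show where that O(n log n) comes from, here is one common way to write Merge Sort (a sketch of my own, not necessarily the version from lecture): the list gets split in half over and over, which gives the log n levels, and each level does about n work merging.

 def merge_sort(lst):
   if len(lst) <= 1:               # a list of 0 or 1 items is already sorted
     return lst
   mid = len(lst) // 2
   left = merge_sort(lst[:mid])    # sort each half...
   right = merge_sort(lst[mid:])
   merged = []                     # ...then merge the two sorted halves
   i = j = 0
   while i < len(left) and j < len(right):
     if left[i] <= right[j]:
       merged.append(left[i])
       i += 1
     else:
       merged.append(right[j])
       j += 1
   return merged + left[i:] + right[j:]   # tack on whatever is left over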

Even Barack Obama has some insight on the subject, check it out here:
That's all folks. I hope everyone aces all their exams!

Saturday, March 15, 2014

Binary Search Trees

Previously I wrote about binary trees, and this week I am going to talk about binary search trees, or BSTs for short. Every BST is a binary tree, but not every binary tree is a BST. That might sound complicated at first, but let me explain.

A binary search tree is a binary tree with the following properties:

  • All the values in the left subtree of a node are less than that node's value.
  • All the values in the right subtree of a node are greater than that node's value.
  • No value is duplicated anywhere in the tree.
So if we were declaring a class BST, that class could be a subclass of a binary tree class, since a BST has all the attributes of a binary tree.

A BST is arranged this way because it makes searching for a value in the tree very easy. All you have to do is start from the root: if the root's value is smaller than what you are looking for, search the right subtree; if it is bigger, search the left subtree; and if the root is exactly what you're looking for, you are in luck. Repeating this at every node throws away roughly half of the remaining tree at each step (assuming the tree is reasonably balanced), which makes finding a value even in a really large BST fast. This algorithm is called "binary search", hence the name binary search tree.
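
Here is a rough sketch of what that search could look like in code. The class and attribute names (BSTNode, value, left, right) are just ones I made up for this example, not necessarily what we'll use in class.

 class BSTNode:
   """A bare-bones BST node, just for this sketch."""
   def __init__(self, value, left=None, right=None):
     self.value = value
     self.left = left
     self.right = right

 def bst_search(node, value):
   if node is None:
     return False                          # ran out of tree, it's not here
   if value == node.value:
     return True                           # you are in luck
   if value < node.value:
     return bst_search(node.left, value)   # smaller values live on the left
   return bst_search(node.right, value)    # bigger values live on the right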

When I look back I realize that this term/course went by way too fast; in 20 days or so lectures will end, and then come the exams and all that stuff. I just feel like we started yesterday. I'm not trying to say the course is so easy that it flew by, don't get me wrong. But this is nice, because after all it means summer is coming, which I know is the opposite of the Game of Thrones quote. But wait... Game of Thrones Season 4 is also coming, right during the exams, but some things have a price I believe. Anyways, I hope everyone is having a nice time, until next time.

Sunday, March 2, 2014

Recursion (again)

Well, I already wrote some stuff down about recursion, but I will add more this week since we have been instructed to do so. Some of my classmates dislike recursion because they find it kind of frustrating. They have a point, since a recursive function is often harder to trace and get right than a non-recursive one. But that is not a good excuse, because the recursive version of an algorithm can be much easier to understand. It might be hard to get used to, but recursion is actually really useful: it makes some kinds of tasks much easier to implement and makes the code much simpler, and in some cases I can't think of any other reasonable way to implement the task at all. You should use it when there is a pattern repeated multiple times with small differences between each step. I want to show an example of why recursion is sometimes indeed the easier way to do things, even though it can look complicated.

An example of a situation where it is much more intuitive to use a recursive algorithm is calculating the n-th Fibonacci number. For those who have never heard of it before, the Fibonacci numbers are a sequence of integers where each number is the sum of the two preceding numbers in the sequence. So the sequence goes like this: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55...
I quickly implemented a program to calculate the nth Fibonacci number:

 def fib(n):
   if n <= 1:
     return n     # base cases: fib(0) = 0 and fib(1) = 1
   else:
     return fib(n-2) + fib(n-1)    # each number is the sum of the two before it

But if I had insisted on not using recursion, the code would have been more complicated and harder to follow (to be fair, the loop version avoids recomputing the same Fibonacci numbers over and over, so it runs faster for large n, but readability-wise it gains nothing):

 def fib(n):
   if n <= 1:
     return n

   previous = 0    # fib(0)
   current = 1     # fib(1)

   for i in range(2, n + 1):
     temp = current
     current += previous    # step current forward to the next Fibonacci number
     previous = temp

   return current

As you can see, the recursive example is much easier to understand (and much cleaner) in this case.

Friday, February 28, 2014

Binary Trees

Before the reading week we started to work on binary trees. A binary tree consists of several nodes. One of these nodes is the root, which is the one at the top. The root has at most two children, a left one and a right one, and each child is itself the root of a (possibly empty) binary tree. If a node has no children it is called a leaf; if it has any children it is an internal node.

A binary tree can be shown like this:

[figure: an example binary tree]
Since a binary tree's subtrees are themselves binary trees, most operations on them are easily done using recursion. An example would be traversals. A traversal of a binary tree is a way of writing the tree's values out as a flat (unnested) list. As we saw in class, there are a few different tree traversals. Let's focus on in-order traversal, because I like that one the most; it seems the most natural to me.

To make an in-order traversal of a binary tree, you take the in-order traversal of the root's left child, then add the root, then add the in-order traversal of the root's right child. Clearly this is a recursive definition: my description of how to do an in-order traversal includes doing in-order traversals, so there you go. The in-order traversal of the tree pictured above would be [2, 7, 5, 6, 11, 2, 5, 4, 9]. It's kind of late, so correct me if this list is wrong, but you'll get the idea.
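
If code is clearer than words for you, here is a quick sketch of the same thing, using a nested-list representation [value, left_subtree, right_subtree] that I just made up for this example (None stands for an empty tree):

 def inorder(tree):
   """Return the in-order traversal of a nested-list binary tree."""
   if tree is None:
     return []
   value, left, right = tree
   return inorder(left) + [value] + inorder(right)

 # e.g. inorder([5, [7, [2, None, None], None], [4, None, None]])
 # gives [2, 7, 5, 4]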

Trees can be written as nested lists, or we can declare them as a class. Traversals are only one thing we can implement recursively on trees; I'm sure we will do more complicated stuff with them in the coming lectures and labs. Take care!

Sunday, February 16, 2014

Finding the efficient way to move 4 cheeses.

For some time now I, along with my teammates, have been trying to find a way to move all the cheese from one stool to another. We have been given a way that actually works, and it goes like this:

  • For a cheese stack of 1 cheese round, simply move the cheese to the destination.
  • If the stack has n cheese rounds (n > 1), we need to pick a number i between 1 and n - 1. Then we have a three-step process for moving the stack:
  1. Move the top n - i rounds to an intermediate stool.
  2. Move the remaining i cheese rounds to the destination, using only the three stools still available.
  3. Move the n - i rounds of cheese from the intermediate stool to the destination stool.
Okay, so we have a nice recipe that translates easily into a recursive computer program; so far so good. But there is one huge problem keeping us from doing that: the number i itself. How do we find the most efficient i at each step of the recursion?
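
Before getting to that, just to show how naturally the recipe above turns into recursion, here is a rough sketch (all the names are my own, and the i it picks is only a placeholder, since choosing the best i is the whole puzzle):

 def move_three_stools(n, source, spare, destination):
   """Classic three-stool Towers of Hanoi move of n cheese rounds."""
   if n == 1:
     print("move top cheese:", source, "->", destination)
   else:
     move_three_stools(n - 1, source, destination, spare)
     print("move top cheese:", source, "->", destination)
     move_three_stools(n - 1, spare, source, destination)

 def move_four_stools(n, source, spare1, spare2, destination):
   """The three-step recipe from the list above."""
   if n == 1:
     print("move top cheese:", source, "->", destination)
   else:
     i = n // 2     # placeholder split; finding the best i is the hard part
     move_four_stools(n - i, source, spare2, destination, spare1)   # step 1
     move_three_stools(i, source, spare2, destination)              # step 2
     move_four_stools(n - i, spare1, source, spare2, destination)   # step 3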

This is the hard part of the assignment. The way I coped with it was quite painful, but it worked. It took a handful of unsuccessful attempts and a few hours of trying different approaches before I found one that worked.

What I did was move the cheese manually (okay, not real cheese, just crappy illustrations on paper) and record, for every n, the i that gave the smallest number of moves. After a certain point I tried to find a formula for that i. I got a few different formulas which gave close-enough i's, and after a few of those attempts I figured out one that worked. I don't actually know if it works for enormous numbers, but it worked at least up to 25 rounds of cheese, so I just went ahead and used it.
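
If you'd rather make the computer do the searching, here is a brute-force alternative (this is not the formula I found, and the names are made up): moving i rounds with only three stools takes 2^i - 1 moves, so you can recursively try every possible split and keep the cheapest one.

 from functools import lru_cache

 @lru_cache(maxsize=None)
 def min_moves(n):
   """Fewest moves to transfer n cheese rounds using four stools."""
   if n <= 1:
     return n
   return min(2 * min_moves(n - i) + 2 ** i - 1 for i in range(1, n))

 def best_i(n):
   """The split i that achieves that minimum for a stack of n rounds (n > 1)."""
   return min(range(1, n), key=lambda i: 2 * min_moves(n - i) + 2 ** i - 1)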

The deadline has already passed, but I don't know if I am allowed to give the formula out. Anyways, have a nice week!

Sunday, February 2, 2014

Recursion (week 4)

This week in class we learned about recursion. I think recursion is kind of hard, because to understand a recursive function you need to think as if you were the computer and work out what each step does. This process is called tracing. Tracing code is necessary for understanding it, as well as for implementing new functions involving recursion. It's kind of painful and energy-consuming to do, but I suppose I'll get better as I get more familiar with it. I just need to exercise more, just like with anything else.

Other than being hard, I think recursion is a cool concept. It is a nice way to overcome some programming challenges. It's basically a function calling itself on smaller and smaller pieces of a problem until it hits a base case we specified, which makes it feel a bit like the while and for loops we already use, even though it's really quite different. Calling a function from inside its own body seemed kind of odd at first, but it is actually a brilliant way to do things. I can't think of any examples which can only be done using recursion, but I am certain it will come in handy in many tasks awaiting us. Have a good week.

Sunday, January 26, 2014

First post(week 3) and OOP!

Hello fellow 148'ers, welcome to the first post of my CSC148 SLOG. I'm going to be posting here periodically about how my 148 lectures are going and what I'm enjoying and learning along the way. This is the third week of lectures, and we mostly learned about exceptions and inheritance this week. I had some knowledge of these concepts from the AP CompSci course I took back in high school. Well, that was more than 2 years ago, I learned this stuff using Java, and frankly, if I were asked to demonstrate my knowledge right now, I wouldn't be able to lift a finger. But nevertheless, while some of us are learning these concepts for the first time, I (and probably some other classmates like me) am actually benefiting by learning how to implement them in Python and shaking some dust off programming skills we have yet to use in first-year CS courses.

I'm particularly excited about inheritance and digging deeper into the object-oriented programming world. Although it can sometimes be frustrating to deal with all those objects, classes and subclasses (particularly forgetting to add the keyword "self" EVERYWHERE), I think OOP is a really nice way of structuring your thinking. It can actually be applied to numerous real-life situations, and by real-life situations I don't just mean real-life coding challenges: virtually any system with a working structure can be thought of as if it were an object-oriented program. This actually makes OOP easier to apply in programming, because it doesn't require you to treat it as an abstract concept; it lets you think about it as if you were solving an everyday problem. Well, I'm writing all this while studying for an economics midterm, and believe me, the thought "I'd rather be studying CS than doing this" has crossed my mind multiple times today, and it probably will again until the test tomorrow night, but I guess I'll have to deal with it. Make sure to check out my later posts; until then, I wish us all a nice time struggling with CSC148!