Sunday 6 April 2014

Farewell CSC148

Well, this is the last "SLOG" I'll be writing for this course. Whether or not I continue it next year depends on whether I remember to. We didn't learn too much this week since we spent a big chunk of the time getting our tests back and going over review, but we did learn one interesting thing: using dictionaries to speed up searching. The idea is that the built-in function "hash" turns a value into a key for a dictionary, and the data is stored under that key. A search then just checks whether the key exists, which is quick (constant time on average) since the dictionary jumps straight to where the key would be instead of scanning through all the data.
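A rough sketch of the idea (the `payroll` example data here is made up for illustration):

```python
# Store records in a dict keyed by a hashable value. Membership checks
# are constant time on average, because Python uses hash(key) to jump
# straight to the right spot instead of scanning every entry.
payroll = {}
for name in ["alice", "bob", "carol"]:
    payroll[name] = len(name)  # any value associated with the key

# "in" uses hash(key) under the hood, so no linear scan is needed.
print("bob" in payroll)   # True
print("dave" in payroll)  # False
```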

Hmmmm... I guess there isn't too much more I want to talk about now... 1 week to study for exams... I'll probably end up playing games with my friends for one of these days to relieve some stress. Anyways, it's the weekend, off to play some League of Legends!

Monday 31 March 2014

Sorting and Efficiency

Well, this one's a bit late, but here it is: sorting algorithms and how efficient they are. In programming, the efficiency of a sort is measured by how its running time grows as the amount of data to sort increases. This matters because in real-life situations, big companies need to sort things like payrolls, which could contain millions of entries that need to be sorted constantly; an inefficient sort would take ages to process everything. The basic sorts we looked at (bubble, insertion, selection, merge and quick sort) all have worst-case run times of either n^2 or nlogn. A run time of n^2 means that as n grows, the time it takes to sort the data grows proportionally to n squared, which starts to take a serious amount of time as n gets really large. Using merge sort for extremely large lists is much better, since its run time is nlogn, which is far smaller than n^2 for large n. The key idea is that the bigger the list gets, the more time an efficient sorting algorithm saves you. Big-O notation describes the worst-case run time of each sort and how it grows for extremely large lists of size n.
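To make the n^2 vs nlogn gap concrete, here's a small sketch: a textbook O(n^2) insertion sort timed against Python's built-in `sorted` (Timsort, O(nlogn)). The list size and data are arbitrary choices for illustration.

```python
import random
import time

def insertion_sort(lst):
    """O(n^2) in the worst case: each item may shift past every item before it."""
    result = list(lst)
    for i in range(1, len(result)):
        j = i
        while j > 0 and result[j - 1] > result[j]:
            result[j - 1], result[j] = result[j], result[j - 1]
            j -= 1
    return result

data = [random.randrange(10000) for _ in range(1000)]

start = time.perf_counter()
slow_result = insertion_sort(data)
quadratic_time = time.perf_counter() - start

start = time.perf_counter()
fast_result = sorted(data)  # built-in O(n log n) sort
nlogn_time = time.perf_counter() - start

print(quadratic_time, nlogn_time)  # the gap widens quickly as n grows
```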

Tuesday 25 March 2014

Runtime Analysis

This one's a bit late, but last week we started looking into Big-Oh and the run times of different sorting algorithms on different input sizes. The professor first introduced quick sort, which I learned about in high school but never actually programmed, so it wasn't too difficult. It more or less chooses a value in the list (the pivot) and moves all the values smaller than it to its left and all the values bigger than it to its right. It then recursively calls itself on each side until, eventually, you are left with single-value lists, which are by definition "sorted". Merge sort works in a similar way, but instead of separating the smaller and bigger values to each side, you just split the list into halves regardless of the values. After the list is broken down into single values, the pieces are merged back together in order by repeatedly comparing the smallest remaining value of each half. Both sorts have a run time of about nlogn in their best case, where n is the number of values in the list, but the problem with quick sort is that if the "pivot" you chose doesn't split the list into 2 roughly equal halves, the run time gets worse, going all the way to n^2.
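The two sorts described above can be sketched like this (a simple list-slicing version, not the in-place versions usually shown in lecture):

```python
def quick_sort(lst):
    """Pick a pivot, recurse on the smaller and bigger sides."""
    if len(lst) <= 1:  # a single value (or empty list) is already "sorted"
        return lst
    pivot = lst[0]
    smaller = [x for x in lst[1:] if x < pivot]
    bigger = [x for x in lst[1:] if x >= pivot]
    return quick_sort(smaller) + [pivot] + quick_sort(bigger)

def merge_sort(lst):
    """Split in half regardless of the values, then merge the sorted halves."""
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

def merge(left, right):
    """Repeatedly take the smaller front value of the two sorted lists."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    return result + left[i:] + right[j:]
```

Using the first element as the pivot is exactly what makes quick sort hit its n^2 worst case on an already-sorted list: every split puts everything on one side.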

Anyways, that's about all I wanted to talk about for now. I'm spending some time finishing up my cheat sheet for Wednesday's term test... hopefully it goes well. I'll probably do a more detailed summary of what we've done sometime later this week.

Monday 17 March 2014

Longggg Catchup

Ohhhhh boy, I've been busy doing assignments and studying for all the midterms, and completely forgot about this thing until it was mentioned in lecture... Alright, let's sum up what we've learned in the last 2 weeks or so.

Alright, I believe I left off at LinkedLists, and we began doing them the way I learned before: a node pointing to another node, until a node eventually points to None. The midterm happened after that and, to be honest, it wasn't that bad. I got it back the next week and lost a mark because I called super(...) instead of Stack.__init__... oh well, I still did well. Anyways, back to LinkedLists and their methods for the head and tail. Insertion and deletion were fairly simple, as we had to program them ourselves back in high school. Sadly, I missed the Wednesday lecture that week due to family issues, so I still need to catch up on that.
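A minimal sketch of the node-chain style described above, with the insertion and deletion mentioned (names like `insert_front` are my own, not the course's):

```python
class Node:
    """Each node points to the next; the last node points to None."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def insert_front(head, value):
    """Insertion at the front: the new node just points at the old head."""
    return Node(value, head)

def delete(head, value):
    """Remove the first node holding value, returning the (possibly new) head."""
    if head is None:
        return None
    if head.value == value:
        return head.next
    head.next = delete(head.next, value)
    return head

head = None
for v in [3, 2, 1]:
    head = insert_front(head, v)  # builds 1 -> 2 -> 3 -> None
head = delete(head, 2)            # now 1 -> 3 -> None
```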

Moving on to the next week, we learned about the Binary Search Tree, where each node has a value and a left and right side. The lab for this week was fairly interesting: some more recursion for BSTs and creating helper functions that recurse on themselves. With the code that was given to us, it was fairly easy to construct the new helper functions by doing something similar to what was already coded. Now for what we're finally moving into and what I've been dreading... sorting algorithms and runtime analysis... I've been dreading this moment since I took CSC165 last semester, and oh boy, it's back to haunt me. Luckily I still remember a lot from that course, so big-O isn't exactly a new concept to me, but man did I hate it.
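The BST idea can be sketched with a couple of recursive helpers like the ones from the lab (this is my own minimal version, not the lab's starter code):

```python
class BSTNode:
    def __init__(self, value):
        self.value = value
        self.left = None   # subtree of values smaller than self.value
        self.right = None  # subtree of values greater than or equal

def insert(node, value):
    """Recursive helper: walk left or right until an empty spot is found."""
    if node is None:
        return BSTNode(value)
    if value < node.value:
        node.left = insert(node.left, value)
    else:
        node.right = insert(node.right, value)
    return node

def contains(node, value):
    """The BST property means each comparison discards a whole subtree."""
    if node is None:
        return False
    if value == node.value:
        return True
    if value < node.value:
        return contains(node.left, value)
    return contains(node.right, value)

root = None
for v in [5, 3, 8, 1, 4]:
    root = insert(root, v)
```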

Well, this was a quick summary of what we've done for the past few weeks. Still 2 more midterms to go, one of them being for this course... hopefully it goes well.

Sunday 2 March 2014

Recursion... and Some Other Stuff

Well... apparently we're supposed to write a paragraph about recursion this week, so let's get on with it. The idea behind recursion is to have a function keep calling itself until a "base case" is reached. For example, the most common recursive method we used was calculating the sum of a nested list. The function keeps running itself as long as the next element is a list, and eventually you reach a case where, instead of a list, you hit a numerical value of some kind. That numerical value is the base case in this situation, and the method then begins returning back up through the recursive calls that led to it. The point of a recursive method is that you can solve a very complex problem by breaking it down into smaller parts, analyzing those, and then moving back to the bigger parts while knowing what the smaller parts they consist of will return. We used recursion recently in our trees as well, by analyzing each node and its children until we eventually reached a node without any children. That is when we knew it was time to end the recursive calls and begin going back up the recursive steps.
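The nested-list sum described above can be sketched in a few lines:

```python
def nested_sum(lst):
    """Sum every number in a possibly nested list of numbers."""
    total = 0
    for element in lst:
        if isinstance(element, list):
            total += nested_sum(element)  # recursive case: go one level deeper
        else:
            total += element              # base case: a plain numerical value
    return total
```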

That's enough recursion for now; time to go over what we learned in the lectures this week. This week we went over LinkedLists, and it's slightly different from how we did it in Java. In Java, we had a head and a tail, similar to how we're dealing with LinkedLists now, but instead of pointing to an empty LinkedList object, the last item in a LinkedList pointed to null. We also did something in Java back then called a Doubly LinkedList, which had both an ending null and a null before the head, so the LinkedList could be read in both directions. The slight difference with what we learned now is that the head holds 1 value and everything that is left is in the tail. In Java, we had a node pointing to a node, pointing to another node, until that last node pointed to null. It's about the same thing, but it was a bit confusing at first for me to see how the tails of the LinkedList worked.
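A rough sketch of that head-plus-tail style, where an empty LinkedList stands in for the null Java used at the end (the class and method names here are my own):

```python
class LinkedList:
    """head holds one value; tail is another LinkedList holding the rest.
    An empty LinkedList (head and tail both None) marks the end of the chain."""
    def __init__(self, head=None, tail=None):
        self.head = head
        self.tail = tail

    def is_empty(self):
        return self.head is None and self.tail is None

def prepend(ll, value):
    """Return a new LinkedList whose head is value and whose tail is ll."""
    return LinkedList(value, ll)

def to_pylist(ll):
    """Walk the tails until the empty LinkedList is reached."""
    result = []
    while not ll.is_empty():
        result.append(ll.head)
        ll = ll.tail
    return result

lst = LinkedList()  # start from the empty LinkedList
for v in [3, 2, 1]:
    lst = prepend(lst, v)  # head 1, tail holding 2 then 3
```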

Saturday 22 February 2014

Assignment 1 Wrap Up

2 weeks or so have passed since the last SLOG, since I went on vacation with my family and sadly had no access to a computer, but anyways, here's my SLOG about what has happened so far.

Assignment 1 was due recently, and man was it tough. The first few parts were pretty easy, as they were just creating methods for the TOAH GUI to play properly, but when it got to the recursion part, I was stuck for a long time. I wasn't exactly sure how to approach it, but after a bit of digging around and getting some help from the TAs, I learned that the best way to deal with the 4-stool recursion was to combine the formula given to us in lecture for solving for i with what we already knew about how the 3-stool recursion worked. Overall, the assignment was fairly challenging, but now I also see how powerful recursion is.
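The move-counting side of that idea can be sketched as follows. This assumes the split strategy described above: move n-i cheeses aside using all 4 stools, move the bottom i with only 3 stools, then move the n-i back, trying every i and keeping the cheapest (the function names are mine, not the assignment's):

```python
from functools import lru_cache

def three_stool_moves(n):
    """Classic Tower of Hanoi: 2^n - 1 moves for n cheeses."""
    return 2 ** n - 1

@lru_cache(maxsize=None)
def four_stool_moves(n):
    """Minimum moves for n cheeses on 4 stools: try every split point i."""
    if n == 1:
        return 1
    return min(2 * four_stool_moves(n - i) + three_stool_moves(i)
               for i in range(1, n))
```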

Now, moving to what we learned this week: we finally began looking at tree structures. We learned a few terms that describe what a tree is, such as a node. A tree is more or less a set of nodes where each node (other than the root) has exactly one "parent" node that it branches from. Each path down a tree is unique, and no node has children connected to nodes further up in the tree or in other paths. We also looked at a few methods that go with trees, such as finding the height of a tree (its longest path length) and how many nodes there are in a tree. Tree structures are a great way to see where something came from; for example, if parts of a program branch from a "parent", tracing the path down from node to node helps track what is happening and what each step adds on.
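The two tree methods mentioned, height and node count, can be sketched like this (a minimal Tree class of my own, where each child list entry is a subtree):

```python
class Tree:
    """Each node has one parent; children is a (possibly empty) list of subtrees."""
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []

def height(t):
    """Number of nodes on the longest path from the root down to a leaf."""
    if not t.children:
        return 1
    return 1 + max(height(c) for c in t.children)

def count_nodes(t):
    """The root plus however many nodes are in each subtree."""
    return 1 + sum(count_nodes(c) for c in t.children)
```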

Well, that's another week of CSC148. Only 2 more midterms to go... and one of them is for CSC148. Hopefully I'm prepared for it and do well. There's gonna be a lot of gaming after those 2 are done...

Sunday 9 February 2014

Even More Recursion...

Well, another week of CSC148 has passed, and we did even more recursion in both the lectures and the labs. That's not a bad thing, since recursion is a very powerful tool when it comes to programming, so let's get into a little detail on what we did. In the lab, we looked at some more code involving recursion and had to trace the code running on 3 different cases. The interesting thing was that each case contained the previous case inside it, so we just took the previous value and substituted it in. Afterwards, we actually created our own recursive code, and it was a good experience creating our own base cases. For the first one, we had to create 2 base cases: either the current value is the one we were looking for, or the value is None. This took a while and I needed to ask the T.A. for a bit of help, but it wasn't that hard once he told me about the 2nd base case, since I had checks like that in my previous code. The second one was just creating a list from another list, and the base case was just when the value was not a list, so this example was easy and quickly finished.
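From memory, the two exercises looked roughly like this; the exact lab code differed, so treat these as reconstructions with my own names:

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def find(node, target):
    """Two base cases: we hit None (ran off the end), or we found the target."""
    if node is None:          # base case 1: the "null" at the end of the chain
        return False
    if node.value == target:  # base case 2: found the value we're looking for
        return True
    return find(node.next, target)

def flatten(lst):
    """Build a flat list from a nested one.
    Base case: a non-list value becomes a one-element list."""
    if not isinstance(lst, list):
        return [lst]
    result = []
    for element in lst:
        result.extend(flatten(element))
    return result
```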

Now, in our lectures, we looked at an example similar to the Tower of Hanoi, but dealing with cheese. In high school, when we were dealing with recursion, not only did we look at the Tower of Hanoi, we had to create a program similar to how it worked. The way the professor taught it in lecture, however, was slightly different and seemingly much simpler: just move everything above the bottom piece to the middle, move the bottom piece to the far side, then bring the rest back on top, all through repeated recursive calls. It was pretty confusing at first, but after reviewing it a bit more and listening to the professor explain how the recursion worked, it made a lot more sense. Afterwards, we looked at naming and where Python looks for the names of a variable or method. We worked with private and public variables in Java, and it's slightly different in the sense that I don't recall ever having to declare something global or non-local, since public and private handled that. Other than that, tracing where "spam" came from in the lecture example made it pretty easy to understand whether or not "spam" was visible outside of the method "scope_test".
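The "move the top aside, move the bottom, bring the rest back" pattern from lecture can be sketched as a 3-stool solver that returns the list of moves (the function and stool names here are my own, not the lecture's):

```python
def move_cheeses(n, source, intermediate, destination):
    """Return the (from, to) moves that shift n cheeses from source to
    destination: move n-1 aside, move the bottom, bring the n-1 back."""
    if n == 1:  # base case: a single cheese moves directly
        return [(source, destination)]
    return (move_cheeses(n - 1, source, destination, intermediate)
            + [(source, destination)]
            + move_cheeses(n - 1, intermediate, source, destination))
```

The same function recurses twice per call, which is where the 2^n - 1 total move count comes from.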

Well this ends another SLOG for CSC148... looking forward to the reading week break but first... I have to worry about assignment one and the upcoming midterm for chemistry...