Friday, December 7, 2012

Google+ Communities

This was inevitable. And I think it is a bold step forward into something awesome. Google+ communities. Could this be the next big circle? In many ways, yes. In some strong ways, no.

I strongly suspect the reason for communities to happen within Google+ is based on how people use Google+ right now. Since most people don’t really have the full gamut of their real life friends there yet, a lot of the activity happens around community-based interests. For example, at least once a week I see someone sharing a circle of ‘scientists’, or ‘googlers’, or ‘entrepreneurs’, or ‘developers’. You get the idea. Most of what I’ve posted is tech stuff, and now I mostly post to my ‘developers’ and ‘geeks and techies’ circles. More importantly, when you add someone to a circle, you see all their posts, not just their posts on the particular interest you followed them for. Not so with communities. People sharing to a community share to it because they want to talk about that interest. This is why I pondered the question: could communities be the next big circle? After all, Google lets you share directly to the community only.

But what about the strong ways it isn’t the next big circle? Here’s the problem. When it comes to the circles of people we’ve found around an interest, we know who we’ve got. We know we’ve got passionate people and not fanboys. Even when we take a shared circle, if there’s noise in there from certain people we can remove them from the circle, because it’s our circle. And that’s the key point. Our circles are for us. A community is, well, a community. Like it or not, the Dota 2 community is going to pick up LoL fanboys and they’ll come along to troll about how the hero that’s been adapted from Dota 1 is just a copy of their precious LoL hero (when in fact their hero too has been adapted from Dota 1). And when that happens, there’s nothing you can do about it. Sure, the community manager can kick them out (I hope), but in the end, trolls come in greater numbers than community managers. Private groups aren’t an answer to this either. The content may be really high quality, but you sacrifice the openness of having someone unknown come into your world, discover great stuff and in turn share more awesomeness.

Thus, depending on how the communities feature continues to evolve, it’ll be interesting to see whether it becomes a place of continued awesomeness, or a long thread of YouTube-like comments.

Saturday, September 29, 2012

The Past Weeks. Building a Multi Threaded Scraper

I’ve been silent for the past bunch of weeks. No, this isn’t one of those ‘I’m back to posting’ blog posts. This is just about where I have been and what to expect very soon, if I’m allowed to actually post about it, which I really do hope I am, since the communities out there helped me so much in solving the problems. When I say communities I mean other people’s blogs and such. Which is really awesome when you consider that the information out there was so vast that I didn’t have to go to a single #python IRC channel to get more information. That, and the documentation on Python is brilliant.

So, what is it that I’ve been so busy with over the last couple of weeks? Well, anything.lk has gotten a partnership of sorts going with a business to help increase the number of items we can supply our fans and loyal customers with. Why? Because we believe that so much more can be done in this business. (Sorry, but I can’t divulge the details just yet. That’s another reason why I’ve been so silent on the work that I’ve been doing.) But how many items, you ask? That’s a good question. The answer is anywhere between 2 million and 9 million items. The only problem was, due to the size and the nature of the partnership, we would have to do the heavy lifting in order to get these products to us. Since they run an online site, the best way to do this and to keep our list up to date with theirs would be to run a scraper across it. Not a crawler. A scraper.

Building this wasn’t easy. The tools I used were Python and BeautifulSoup. My first naïve implementation of the scraping was taking about 2-3 seconds per item scanned. If I were to run the scraper across 2 million items, that would take me 57 days at 2.5 seconds per scan. Not good. The next thing to do was to build the scraper in a multi-threaded fashion. At around 100 threads I discovered the sweet spot between enough threads and speed and left it at that. The scrape time was brought down to 6 days. From that point onwards it was running a few trial scrapes. What followed was me discovering just how memory intensive this was going to get. In under 10 minutes I was eating into 3+GB of RAM and the usage wasn’t slowing. Several hard crashes later I realized that I would have to severely cut down on the number of objects I could have in memory. I limited it to running an insertion into the database every 100 records, and thus brought the memory usage down to a maximum of about 70 MB of RAM. This was, I think, one of the biggest highlights for me personally. From there, I had to further optimize how the connections were taking place. The problem I had before was that I was waiting for 100 Queue objects to be cleared before dumping the next 100 in. Why? Because I had made a bad design decision not to put a cap on the number of threads being used inside the scraper. So if I didn’t do a q.join() every 100 items I would have been in serious trouble, spawning 2 million threads in a matter of seconds. This was mostly because I was using both threads and Queues for the first time, so I wasn’t too sure about the logical decisions I should make till much later.

Thus, the next decision made was to limit the number of threads, and keep a maxsize on the Queue so that it didn’t grow to two million objects either all at once. This way, as soon as a page was scraped the next page was slotted in instead of waiting for 100 to finish to get its chance. The result of this? 6 days to 3 days used for scraping.
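
To make the moving parts concrete, here is a minimal sketch of that final shape. It is not the actual code: scrape_page, insert_batch and iter_product_urls are made-up stand-ins, and it uses the Python 3 spelling of the queue module (back then the module was called Queue).

# Fixed worker pool + bounded queue + batched DB inserts to keep memory flat
import queue
import threading

NUM_WORKERS = 100                      # the ~100-thread sweet spot mentioned above
BATCH_SIZE = 100                       # flush to the database every 100 records
url_queue = queue.Queue(maxsize=500)   # bounded, so 2 million URLs never sit in memory at once

batch = []
batch_lock = threading.Lock()

def worker():
    while True:
        url = url_queue.get()
        try:
            record = scrape_page(url)          # hypothetical helper: fetch + BeautifulSoup parsing
            with batch_lock:
                batch.append(record)
                if len(batch) >= BATCH_SIZE:
                    insert_batch(batch)        # hypothetical helper: one DB round trip per 100 records
                    del batch[:]
        except Exception:
            pass                               # the real code logged and retried (the fail safes mentioned below)
        finally:
            url_queue.task_done()

for _ in range(NUM_WORKERS):
    threading.Thread(target=worker, daemon=True).start()

for url in iter_product_urls():                # hypothetical generator over the ~2 million product pages
    url_queue.put(url)                         # blocks while the queue is full, keeping memory flat

url_queue.join()
if batch:
    insert_batch(batch)                        # flush whatever is left over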

And that was what I was doing the past few weeks. I haven’t spoken about all the other little fail safes that I had to put in. There were a number of failures from the site being scraped and the internet connection at the office that I had to factor in. The scraper was launched yesterday evening. I have no idea what’s happening with it right now. But if it has for some reason broken due to bad coding, I’ll need to build in another fail safe for starting it up and getting it to pick up where it left off. That’s one thing I haven’t had time to build, given the constraint that the scraped items need to make their way to our site pretty soon. Pray I shall then, I guess.

Monday, September 3, 2012

Quick Post: Passing Parameters for an SQL Statement with IN clause using SQLCommand (C#)

There are probably a variety of ways to solve this little problem, as this Stack Overflow thread will show, but I thought that showing how I solved it would still be worth it. All I do essentially is build a string: based on how many parameters there are to insert into the SQL statement, I append a ‘?’ with a comma after it, and once I exit the loop I append one more ‘?’. Note that the number of times I run through the loop is one less than the actual length of the parameter array. Once I’m done building the string, I iterate through the parameter array and add each parameter in. This code is tested and working.

// conn is an open SQLiteConnection and sa is a SQLiteDataAdapter set up earlier
StringBuilder selectstring = new StringBuilder();
selectstring.Append("SELECT ROWID, * FROM tbl1 WHERE ROWID IN (");

Int64[] arr = { 1, 2, 3, 3, 4 };

// one '?' placeholder per parameter: n-1 with trailing commas, then a final one that closes the list
for (int i = 0; i < arr.Length - 1; i++)
{
    selectstring.Append("?,");
}
selectstring.Append("?)");

sa.SelectCommand = new SQLiteCommand(selectstring.ToString(), conn);
foreach (Int64 a in arr)
{
    sa.SelectCommand.Parameters.Add(new SQLiteParameter(DbType.Int64, a));
}

System.Data.DataSet ds = new System.Data.DataSet();
sa.Fill(ds);
conn.Close();

Saturday, September 1, 2012

Doubles and Decimals in C#

While I say C#, from what I’ve researched on floating point types so far, this concept could apply to most languages.

If you’ve been using double values throughout your program, I can safely say at this point that you are probably doing it wrong. You may never notice it, but arithmetic with doubles will eventually produce a tiny error that goes unnoticed until the day some critical function is supposed to fire based on a value and you find it isn’t getting triggered. Murphy’s law. Believe in it.

Where does this spawn from? While building a system for very very basic data analysis, I needed an even more basic addition of numbers to ensure that the total of values in a column was equal to 100 before allowing the user to progress to the next step. I was testing the program out and everything seemed fine, and just as I was about to roll the system out, I had a few more values to change in the database. Instead of doing it manually I thought I’d test the system again and do it through there. (Why is this important? Because it’s incredible where some things can go unnoticed through several hours of complete testing). While all this time I had been adding whole numbers, in this case I needed to actually add decimal numbers. Somehow, even though a basic on-paper calculation gave me 100, the program was saying the values do not add up to 100. In debug mode, I discovered that at one point, 22.4 + 23.7 was giving me 46.099999999999994 instead of 46.1. What. The. Debug??

Reading through the documentation and several threads on the internet enlightened me that double values aren’t meant to be exact. They are meant for speed. They store numbers in binary, so most base-10 fractions can only be approximated, and they can even be affected by odd things such as DirectX switching the FPU to single precision. It’s scary, but it is what it is. The solution? Use decimal types. Almost always, when the values have to come out exact, you want the slightly heavier but precise decimal type. DO NOT use the binary floating point types (float and double) for values that must be exact.
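
A tiny illustration of the difference (a throwaway sketch, not code from the system above):

double d = 22.4 + 23.7;          // stored as 46.099999999999994... in binary floating point
Console.WriteLine(d == 46.1);    // False — neither 22.4, 23.7 nor 46.1 is exactly representable in binary

decimal m = 22.4m + 23.7m;       // decimal works in base 10, so these values are stored exactly
Console.WriteLine(m == 46.1m);   // True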

But that begs the question. What are double types useful for? Sure, they are faster, but where would you use them? Essentially in anything that needs speed and can tolerate a very small error. Redrawing sprites based on their screen positions can use doubles to store the vector coordinates. Believe it or not, mining extremely large datasets to get a general trend is another likely application. But for those of us who write applications for everyday use, we need exact values, so keep the decimal type in mind.

Monday, August 27, 2012

Accessing MySQL Remotely With MySQL Workbench

Command line be darned. Visual tools are there for a reason and if you honestly find using the GUI to be easier in visualizing complex queries then you should use it. It also helps in saving time when you want to scan the database to see what’s in it. But that’s not really the point here. A few days ago I began developing some major systems for internal use at the workplace and it was finally time to let the SQLite databases go. Not so much because of data storage needs but mostly because of the need for SQL side validation of the Foreign Key constraints and also to ensure that I didn’t have to do a lot of extra work to ensure the data integrity stayed intact. But I digress. The problem was that I needed to connect the desktop application and MySQL Workbench as well to the MySQL database that was on a server and there’s really nothing on the internet that actually addresses this problem directly. The best ‘alternative’ to this is to actually use a web service to send the data back and forth. Since I’m not going online for now it’s not really needed for me to actually be transferring data like this.

In order to do this I got my own virtual server set up using Ubuntu Server (ooo, he’s taking the easy way out. No I’m not. Wikipedia uses Ubuntu Server. I’m using the best tool for the job) and my way of access into it is through SSH. For the record I have no idea how I got SSH into PowerShell, and I suspect it happened at some point while I was installing the libraries for Cygwin. Anyway, after SSHing into my server I poked around and discovered that the network admin had already installed LAMP and phpMyAdmin, so my MySQL instance was up and ready. This would end up causing more problems for me than I had anticipated. At this point I’m not willing to go back and reverse all the steps I took to find out exactly which ones were needed, but I know which ones are absolutely necessary and can give options if the steps don’t work properly.

So the first thing you want to go do is actually read the manuals on how to create and manage user privileges. I’m in a bit of a rush here so I’ll add the code later but the main steps are as follows.

First up, create a user apart from root who has all privileges. Later when you learn the full privilege list then you can actually revoke what you don’t really need but for now I’m not entirely sure of what I need and don’t so I granted all privileges. The code went something like (no I’m not being at all precise over here.)

CREATE USER newusername IDENTIFIED BY 'type your pass here with single quotes';

GRANT ALL PRIVILEGES ON *.* TO newusername IDENTIFIED BY 'your password';

FLUSH PRIVILEGES;

If you read the manual you’ll find this creates a user that can basically be accessed from any host. The reason for me wanting to do this is because I’m going to be needing a user that can be accessed from any machine inside the company as I’ll be making a desktop application that needs to access the db.

Exit the MySQL server. The next thing you need to do is stop MySQL from accepting only local requests. For this, open up the my.cnf file found under /etc/mysql/ using sudo vim my.cnf. What you want to do here is comment out the lines bind-address = 127.0.0.1 and skip-networking. Easy way to do this?

:%s/bind-address/#bind-address/g

:%s/skip-networking/#skip-networking/g

And that’s it. I think we are ready. I did go to the extremes of forwarding the 3306 port using iptables. This is the only thing that is really server specific and you’ll want to refer to the manuals of your particular distro. I don’t think this step is necessary, so skip it for now, but in case the actual step of accessing the db through the Workbench or app doesn’t work you’ll want to come back and do this (or the equivalent of it if you aren’t using Ubuntu Server).

sudo iptables -A INPUT -p tcp --dport 3306 -j ACCEPT

sudo iptables -A FORWARD -p tcp --dport 3306 -j ACCEPT

sudo iptables-save

Hopefully this step works without needing the iptables step above.

It’s time to connect MySQL Workbench to the db. Here’s where I made the biggest mistake. I assumed that since I actually connected to the server through SSH, I should use that method to connect to the db when using Workbench. Turned out that I was wrong. Or at least, not wrong, but it turned out that after all of this, a standard TCP connection worked fine. Give the server name as the IP address of the server you are connecting to. XXX.XXX.XXX.XXX, that kind of thing. The port should ideally be 3306 (by the way, if you don’t think your MySQL instance is running on port 3306, highly unlikely as it may seem, just type mysqladmin version into the command line of your SSH session and check the results. There’s one that says port. That’s your port. If your port is different then change everything to match it. Doh!)

After you put the ip, put the username and the password that you created and test your connection.
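
For the desktop application, the same details go into an ordinary connection string. A rough sketch using the MySQL Connector/NET library (the address, database name and credentials are placeholders):

using MySql.Data.MySqlClient;

var conn = new MySqlConnection(
    "Server=XXX.XXX.XXX.XXX;Port=3306;Database=yourdb;Uid=newusername;Pwd=yourpassword;");
conn.Open();   // only succeeds once bind-address is commented out and the user can connect from any host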

You’re welcome.

And that’s how you connect a desktop-based application or MySQL Workbench to a MySQL database that’s on a server.

Sunday, August 26, 2012

Liveblogging tools: Begging for Pricing Disruption

I know I said I would post on the conversation I had with the compere of the Etisalat event, but there’s something that I need to get out of my head after an experience I had today. There was a time when I would go to tech events, live stream events, and update my blog through a live blog plugin. When I started out there was an EXCELLENT, albeit ad supported, tool for live blogging called CoveritLive. Unfortunately, they discovered that free wouldn’t cut it and went paid, while leaving a free tier that has some strange restrictions on how many user actions can be performed on the live blog. That strikes me as a strange thing because it might mean that my live blog is not permanent. Once I go above the threshold for a particular event it’s shut down and I have to pay to ensure that it stays visible to future visitors of my blog.

So then I decided to look for some free alternatives out there. The main ones that I came across were the WordPress plugin for live blogging, a site called Blyve, and Wordfaire. There are many alternative sites, though I believe that ScribbleLIVE and CoveritLive are the only two really worth considering.

What’s wrong with the other ones? The WordPress plugin is not really a liveblogging tool, in the sense that stuff doesn’t get pushed out to the viewers. It gets polled, which isn’t the best solution if you are hosting it on your own server. The second option there is to host your own Meteor server which handles the pushing to the viewers, but again, live blogging isn’t just for techies and therefore the solution should not be tech intensive either.

Wordfaire is nice, but it’s in beta, isn’t all that feature rich and the worst part is that the embedding features are pretty bad. Not only do you have to customize your embedded live blog but as the event goes by you won’t find all the messages in it. It shows only a certain number of messages after which if you want to see the rest you have to visit the Wordfaire site itself to see the full list of messages. I imagine this is for advertising purposes but then that’s why I don’t like the idea of completely free either.

The best alternative I have found is Blyve. It isn’t quite as comprehensive as CoveritLive but it comes really, really close. In the free tier you get 500 uniques per month. For a blog that sees only about double that activity for the entire blog for the whole month, that seems like a pretty good deal. But the problem again is, what happens on the day when my visitors become substantial for my blog posts, but the number of live posts I do isn’t enough to justify paying a not insignificant amount monthly to use a live blogging service that still limits the number of actions/uniques per month it can serve?

The Per Instance Pricing Method

From everything that I’ve said I’m willing to bet that between those who pay monthly and those who use the free tier is a set of customers that are willing to pay some amount but use the free tier simply because they can’t justify paying a full monthly cost. What if any one of the above live blogging companies (I’d vote for CoveritLive and Blyve) came up with a model where people could purchase an instance of the liveblog for that particular post and pay a certain base amount based on the traffic that they expect to receive. If they receive substantial traffic after that they would receive a warning to pay for the next tier for that instance. This is unlikely to happen because unless your post is a really special event with global interest that gets voted to the top of reddit and hacker news, the traffic that you’d get would be fairly easy to estimate. So, step by step here’s how the payment would work

  1. I need to host a liveblog for this month’s Refresh Colombo. I visit the liveblog site and pay $4 for an instance of the liveblog which can host up to 500 unique visits for the duration. $2 for every additional 200 uniques I expect.
  2. The live blog is available and life goes on.

But of course what would happen once the event is over? If it’s a one time payment then the liveblog host bears a cost to keep that viewable in their system, right? Here’s the cool part. Once the liveblog is complete, offer a snippet of HTML where all the content from the liveblog gets hosted on my site. That means all I have to do is copy that HTML and replace the iframe embed code in my site once the event is complete. This isn’t too tech intensive to be a problem and would solve almost all the problems for both parties. What problem does it not solve? The hosting of the pictures. If I want to host my pictures on CoveritLive or Blyve then they should charge me on a monthly basis OR better, move them across to Picasa or Flickr for me and give me the new HTML code which links to those pictures automagically. Boom.

This serves two main purposes. One is that I would be able to have pricing that fits my needs and, I’m sure, the needs of many people out there. And on a second, equally important note, I would have some form of ownership of my data. Maybe the service doesn’t have to be Flickr or Picasa. Maybe they could offer to let me download the pictures so I can upload them to my own FTP if I’m at that level of tech savviness. And if they’ve named them right (for example, according to the time each picture was uploaded relative to the liveblog timeline) then I could simply do a find and replace to swap their URL for the base URL of my FTP.

This probably seems a little too complicated but at its most basic level, I pay for an estimated number of users, I get a new bit of HTML code to embed and I get to keep my photos for free in services I already use, or pay a small fee to let the live blogging company host them for me.

C&C is welcome.

Friday, August 24, 2012

Quick Post: Solution for YouTube Videos Not Loading While Paused

Play a game while you wait for your video to load sir

This is probably not something new for most people but it’s been something that’s bugged me for a while. I’m not on a fast net connection at home and when I watch YouTube videos I usually pause them and leave them to load. Recently I’ve noticed some videos not loading while paused. Which really sucks. It’s not a big problem in the sense that I work around it by letting the video play on mute while I do something else, but it’s a problem nevertheless. I don’t know what’s causing it but I do seem to have found a decent solution.

After searching on Google I found two Google group posts which led me in the right direction. The first was a confirmation that I wasn't alone. And the second one had a solution from a Googler. The solution? Change the quality of the video. Now you obviously don't want to do this while it is loading so ideally you want to do this right at the start which is what I did and I can confirm it works. What I did was to switch from 360p to 240p at the start, wait for the video to start playing and immediately switch back to 360p. Maybe it's my imagination but the loading seemed to be much smoother after that as well. Hope this helps

A Talk With an Etisalat Rep and Some DC HSPA+ Perspective

I’m not entirely sure I should call Abdul a rep but let’s just say right off the bat that rep is purely a term that I have given him. And like I mentioned during the live stream, I would relate most of the stuff I spoke to him about. It’s not a lot but it was insightful although there’s still an empty spot I need to fill by giving a call to the Etisalat hotline. Shame on me for not doing my research. First things first,

A quick recap of DC HSPA+

At the time of writing this post I’ve had the chance to be exposed to two presentations by Etisalat on the same topic, and to test the new connection in more than one scenario and in two locations, so I think that makes it a little fair to give a small commentary and summary on what this is all about. Essentially, by allowing two simultaneous connections to originate from the same source, the speed that one can achieve gets doubled. Both the practical and the theoretical speeds. But no one cares about the theoretical speeds, right? A caveat though: there are three requirements that need to be fulfilled to achieve the new speeds. First up is a dual carrier compatible device. Second is an ISP with the infrastructure to provide the speeds without choking the network. And finally, the capability of the servers you are contacting (e.g. YouTube’s) to serve you at the max speed that the device is capable of.

The Rationale

When speaking with Abdul I was curious as to how they would be marketing this package. Let’s face it. Broadband is good enough to stream videos without a problem. YouTube videos at 720p and above can give issues but up to 480p is fine and honestly, that’s really good enough usually for most fail + cat videos. Even for the olympics, 360p was absolutely fine for a 21 inch screen. So why would most people need double the network speed at quite possibly more than double the price?

The first answer that came through was that this was being targeted as a family package kind of thing. This was in fact reinforced during the presentation at Refresh Colombo when the presenter mentioned that families would be able to share this connection without experiencing a drop in quality of their individual experiences. This also made sense with the fact that in the slides and the promotions the Etisalat groups were carrying around DC HSPA+ compatible MiFi units.

But that gave room to the question of corporate packages. Corporates don’t seem to be amongst the main target groups for this kind of thing, based on what I understood, since they rely more on fixed line connections. There is, in my opinion, another avenue for this tech in corporates, which is the small (fewer than 10 people) businesses starting up these days. Connections like this would be ideal for ad hoc freelance partners to have fast internet without being burdened by fixed line issues. Of course, I think I’m stating the obvious here but I just want to open it up for discussion.

Pricing & Concluding Thoughts

When technology isn’t being geared towards the individual you have to imagine that it isn’t going to be cheap either. After all, it’s for the group and therefore the per individual cost may stay the same. Based on that, I’m guessing that since these packages are probably aimed at groups of 3 people and more, the price should be roughly 2.5x that of any comparable package. The equipment should also be about 3-6x more expensive. The question though is whether or not it’s worth it. If the internet works as advertised, I’m inclined to say it is, based on how much data is included under each tier. Looking at what Etisalat has right now, the Rs. 1,500 package gives a user 12GB before requiring extra payment for each MB (20 cents per MB). SLT gives a user 25GB at 8Mbps for that amount. To get to 25GB you’d have to pay an extra Rs. 2,662.40 to Etisalat under the current connection (the extra 13GB is roughly 13,312MB at 20 cents per MB). Add that to your SLT bill and you would be Rs. 700 away from the Web Pro package that gives 60GB at 8Mbps.

Speed does matter, but with speed and plans to create family oriented packages I think Etisalat would have to choose to get rid of their existing packages and tailor some new ones since all that added speed is going to amount to people completing their quotas really really fast. 12GB is honestly nothing at all. My smartphone usage is 25% of that per month usually so one can imagine what my standard internet usage is like. And as for SLT’s quality of service, beyond the FUD you see on the internet I’ve actually been hearing good things about their newer packages which means that in a battle for pricing I’d still not go with Etisalat. Of course one could say this is Apples and Oranges but given that it’s a family package oriented thing I don’t think the fact that I’m comparing fixed line vs mobile broadband really comes into play here.

The one other concern I do have of course is coverage. I had a chance to play with an Etisalat DC HSPA+ connection at Refresh Colombo yesterday and the maximum I could pull from it was 0.3 Mbps!! That’s ****!! To be fair we had some crazy rain, but then again the rain had died down pretty much by that time, so I don’t see how that really works. After all, my Dialog dongle was clocking 2Mbps before I knocked it out of the USB slot, thereby ruining the rest of my test.

So there you have it. A full evaluation of the Etisalat DC HSPA+ ‘initiative’. In summary, the speeds are real when they do work, the applications for individuals apart from journalists are minimal enough to not make the jump and the charges that could surround this are also a little doubtful BUT I will not make a final call till I call the hotline and hear what they have to say. So, there will be an update to this post but for now this is it.

Wednesday, August 22, 2012

Refresh Colombo August Meetup

It’s been a while since I blogged a Refresh Colombo meetup so this should be fun. I’ll be giving one of the three presentations, which I’m really looking forward to. For the uninitiated, Refresh Colombo is a monthly meetup open to anyone interested in tech. And when they say interested in tech it can be from any angle at all. Not just the deep-in-tech programmer level style. Like the site says, bring anyone you want with you as a guest. Even your grandmother. Yes. The sweet lady who takes pictures with the iPad you gave her to stay in touch with you. Jokes aside, this time’s Refresh Colombo is looking to be awesome and I am just as pumped up about the other two presentations as I am about my own. What’s on the topic line?
I’ll be presenting on Building Software Products Anywhere. Since joining Anything.lk I have experienced a creative high like never before in my life and I am building and rolling out more products over time and learning more about good software than I ever have before. This is strange because my original job description doesn’t really call for anything related to programming. And more than that, I’m planning on being responsible for a shift in how the company uses tech to complete its day to day work that will eventually transform this company into a tech startup to some degree. I want to share this experience with other software devs out there. Why? Because I believe that every software developer who wants to love what they do should be allowed to experience creative highs by taking charge of building products. And since not everyone can afford to be an entrepreneur, I want to share how you can still engage with building software products in the most unlikely of places.
The rest of the topics as per Refresh Colombo.
Visual and Creative Thinking – by Shiran Sanjeewa
Shiran Sanjeewa is the Creative Director at Elite-web-studio a Manchester Based Creative Agency. He possess extensive International Expertise on Branding, Websites, Mobile Applications, UI/UX and Online Marketing. In 2012 he founded “Shiran Sanjeewa Associates” a Sri Lankan Startup Branding & User Experience Consulting firm, now serving Silicon Valley clients with the user experience design on their software and hardware products.
I am really looking forward to this topic given how much I really care about User experiences. And from a person who has an impressive background like this this talk should really be a cracker.
Dual Carrier Cellular Networks: A Practical Outlook – by Damitha Wijewardhana
Damith Wijewardhana holds an electronics and telecommunication engineering degree from University of Moratuwa and a MBA from Postgraduate Institute of Management. He is also corporate member and a Chartered engineer of institute of engineers Sri Lanka and a member of institute of electrical and electronic engineers, USA. He has 6+ years of industry experience in Radio Network Planning, Optimization and related new technologies locally as well as internationally.
Sounds familiar? I assume this talk will have a lot to do with the recently announced DC HSPA+ network by Etisalat. And based on the actual speed tests that were made yesterday and Shazly’s impressions of using it in Dehiwala, I’m also assuming that this talk will give a bit of a rational side to the whole hype around the speeds of a dual carrier network, and maybe a bit of what a network of this nature might include in terms of costs. Which reminds me, I should give a call to Etisalat’s hotline to find out details of their packages for DC HSPA+.
Look forward to a live blog from me, although I obviously won’t be able to blog my own topic. But in place of that I might livestream my own talk. And I will definitely blog about it as a follow up post too. Stay tuned for more information!

The Launch of South Asia's First Dual Carrier HSPA+ Network

Update: The full liveblog can be found here

Today is the official unveiling and opening of South Asia's first Dual Carrier HSPA+. I've tried to research a little bit more than what I know on the subject. The essential point of the whole thing is that you get theoretical speeds which are 6 times as fast as standard HSPA+ connections while in reality getting double the speed. Of course this is just my basic knowledge. The slightly technical bit about it in as much of a layman form as possible is as follows,
3GPP Release 8 defines dual-carrier or dual-cell high-speed downlink packet access (DC-HSDPA) to allow the network to transmit HSDPA data to a mobile device from two cells simultaneously, doubling achievable downlink data rate to 42 Mbits/s. Dual-carrier operation is characterized as simultaneous reception of more than one HS-DSCH transport channel. Dual-cell operation may be activated and deactivated using HS-SCCH orders.
Apologies for the sketchy post but I needed to get a quick introduction done before I run off to the event right about now. Will update more. The more important thing right now is that you stay tuned for the live blog that should start updating in half an hour!



Tuesday, August 21, 2012

Problems With C# Datagrid Binding Combined With Combobox

Here’s an unexpected issue I ran into today that I am currently at a loss on how to solve. When you bind a DataGrid and a ComboBox to the same source and edit the items in the DataGrid you end up with a problem of the placeholder item in the DataGrid finding its way into the ComboBox . I haven’t found a solution yet which is irritating because I don’t need a problem like this finding its way into my system at this point especially. I HAVE TO DELIVER IT TOMORROW!!!!

I found the details of this problem on the WPF Toolkit Discussion board and the only useful information it provides me is to inform me that this is the result of a bad design decision on the part of the WPF team at Microsoft.

From the discussion board:

Hi superlloyd,

Ok, so it sounds like you've got your DataGrid and ComboBox bound to the same collection, and since the ComboBox doesn't know what to do with the NewItemPlaceholder, it crashes.  NewItemPlaceholder is something which we add to the DataGrid.Items collection to represent the blank AddNewRow in the DataGrid.  However, NewItemPlaceholder should not be added to the DataGrid.ItemsSource (just the Items collection), so if you bind your ComboBox to DataGrid.ItemsSource, then this should solve the problem.

If for some reason that doesn't work, a less elegant solution would be to have two separate collections, one for DataGrid (which includes the NewItemPlaceholder) and one for ComboBox (which does not).  Whenever anything is updated or added in the DataGrid's collection, you can manually make those same changes in the ComboBox's collection, which should give the same appearance to the end user of the editing the ComboBox's collection through the DataGrid.

Thanks!
Samantha

This is just a bad decision. I’ll probably dig around the DataGrid code later and see what I can do but it’s a pity given that it’s such an essential control. I’ll probably take the ugly and terrible approach of having a separate collection for the ComboBox because I need to get this done asap!

Monday, August 20, 2012

Cloning Objects in C#

Cloning an object in C# is a surprisingly less straightforward topic than one might think it to be. Cloning is especially needed in crud applications where you want to have the ability to reset data for a form without contacting the database again and without clearing the form. Before I talk of how I used it I’ll just share how I managed the cloning.

Memberwise vs Deep Cloning

The first discovery I made was that there are two forms of cloning available: shallow and deep. (Are those the real terms? I think I should rebrand this blog as a noob’s take on programming). Shallow cloning, aka memberwise cloning, is generally enough for most situations. The problem with directly saying

Object a = b (where ‘b’ is of the same object type as ‘a’) is that, as far as basic OOP concepts go, you are simply copying the reference. Therefore there’s no point in this ‘clone’ since any changes simply get reflected in ‘a’. To get a b.Clone() call you can implement the ICloneable interface, but that’s actually an unnecessary step and I’m still trying to find out why you’d bother with it.

TODO: Research why one should use the ICloneable interface

Back to the matter of memberwise/shallow vs deep cloning. Memberwise cloning basically takes all the individual members of your object and copies them into a new object. The only real problem here is that you have to make an expensive cast upon returning it, since it ends up being returned as an object. Well, relatively expensive anyway, but since you realistically won’t be casting a billion objects in a single operation this probably isn’t too bad. The real problem with this method is when an object contains instances of other objects as part of its parameter group.

What happens inside when cloning memberwise.

MyClass toBeClonedTo = (MyClass)objectToClone.MemberwiseClone(); results in:

MyClass toBeClonedTo = new MyClass();
toBeClonedTo.paramA = objectToClone.paramA;
toBeClonedTo.paramB = objectToClone.paramB;

But what happens when there is an object inside the object?

toBeClonedTo.MyClass2instance = objectToClone.MyClass2instance;

This ends up just copying the pointer to that object. Which means that if any changes are made to that particular instance it affects the cloned object as well which means whatever you did can be thought of as pretty useless.

Deep Cloning

Deep cloning on the other hand is something that has to be implemented by the programmer and is about cloning every piece of information. Cloning. Not ‘Cloning’. Essentially all you have to do is return a new object of MyClass type with the variables instantiated the way you want them to be. If you are in control of all the classes, you might as well call the MemberwiseClone method in the other objects as well (eventually everything is made up of basic types) and put those into the constructor.

public MyClass DeepClone()
{
    return new MyClass(this.paramA, this.paramB, (MyClass2)this.MyClass2instance.ShallowClone());
}

Where MyClass2 will have a method called ShallowClone() that calls the this.MemberwiseClone() method. Do note that the memberwise cloning example above wasn’t real code; it was just to illustrate the concept.

And there you have it. Deep Cloning for objects that have instances of other objects in them as parameters. Shallow Cloning or Memberwise Cloning for Objects containing only primitive types.
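
Putting the pieces together, a compact sketch of the whole pattern (using the same made-up class and member names as above):

class MyClass2
{
    public int x;
    public MyClass2 ShallowClone()
    {
        return (MyClass2)this.MemberwiseClone();    // safe here: only primitive members inside
    }
}

class MyClass
{
    public int paramA;
    public string paramB;
    public MyClass2 MyClass2instance;

    public MyClass DeepClone()
    {
        MyClass copy = (MyClass)this.MemberwiseClone();               // copies paramA, paramB and the reference
        copy.MyClass2instance = this.MyClass2instance.ShallowClone(); // swap the shared reference for its own copy
        return copy;
    }
}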

Thursday, August 16, 2012

Passing a string as a parameter in dbcommands

One of the final steps in building my first major WPF application was of course the database updates. Yes, this was the final part. I prefer doing the data retrieval and User Experience design at the start of a project itself. It makes a lot of internal programming issues so much easier. But that's a muse for later.

The more important thing is that I was seeing a bit of an odd issue with the data insertion for my system where the text based columns weren't getting updated. The code that I was using seemed fairly straightforward too.

sa.UpdateCommand.CommandText = "UPDATE Deal_Category SET CategoryName = ? , High_Cutoff = ? , Low_Cutoff = ? WHERE ROWID = ?";
sa.UpdateCommand.Parameters.Add(new SQLiteParameter(DbType.String,  category.categoryName));
sa.UpdateCommand.Parameters.Add(new SQLiteParameter(DbType.Double, category.high_cutOff));
sa.UpdateCommand.Parameters.Add(new SQLiteParameter(DbType.Double, category.low_cutOff));
sa.UpdateCommand.Parameters.Add(new SQLiteParameter(DbType.Int64, category.rowid));
sa.UpdateCommand.ExecuteNonQuery();

When the update was complete I discovered that the column had become empty. How did that happen? Turns out that if you pass a string as the parameter, it calls the overloaded method that specifies a string input to be the column name. Or at least that’s what I think it was doing since I didn’t have much time to read the documentation completely there. But the important thing was that my value was being passed in as null. After looking at the method I wanted to be using I noticed that it expected an object as an input. Shot in the dark; I cast the categoryName to an object and it worked.
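
In code, the fix was just a cast (a sketch of the changed line):

// The (DbType, string) constructor treats the string as a source column name;
// casting to object picks the (DbType, object) overload, which treats it as the value
sa.UpdateCommand.Parameters.Add(new SQLiteParameter(DbType.String, (object)category.categoryName));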

Not entirely sure if this is the canonical way of doing this but thought I should share since it’s a rather unexpected error that popped up.

Tuesday, August 7, 2012

Deploying Databases in Click Once Applications

I'm currently in the process of building an application that needs an SQLite database deployed with it for use inside the office. This was the first time I'd needed to deploy a predefined database, so I couldn't even run code to create the database on first run. I needed to deploy it with data. I could have deployed it as a .csv file which was then inserted into an SQLite db created during the first run but honestly, if I was going to deploy a csv for that then why bother? I might as well deploy the SQLite file with it, right?

This was my first time deploying a ClickOnce application with a data file so I had to do some digging around to figure out what the best practices for doing so might be. The first step was to include the SQLite db file in the build. The answer to that was found in the MSDN How-to section for specifying which files are published in an application. The relevant part can be found under marking files as data.

  1. With a project selected in Solution Explorer, on the Project menu, click Properties.
  2. Click the Publish tab.
  3. Click the Application Files button to open the Application Files dialog box.
  4. In the Application Files dialog box, select the file that you wish to mark as data.
  5. In the Publish Status field, select Data File from the drop-down list.

Next comes the code to access the database. Keep in mind that what you've written probably follows a certain folder structure which needs to be maintained. 
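
In practice that means resolving the path against the ClickOnce data directory rather than the executable's folder. A sketch (the subfolder and file name are placeholders):

using System.Deployment.Application;
using System.IO;

// Files published as "Data File" are copied to the ClickOnce data directory, not next to the .exe
string baseDir = ApplicationDeployment.IsNetworkDeployed
    ? ApplicationDeployment.CurrentDeployment.DataDirectory
    : AppDomain.CurrentDomain.BaseDirectory;   // fall back for debug runs outside ClickOnce

string dbPath = Path.Combine(baseDir, "Data", "products.sqlite");
var conn = new SQLiteConnection("Data Source=" + dbPath);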

Not a big deal but it's a two step process that needs a bit of poking and digging around to find. Unless you happen to use Mage. In which case you should be fine. 

Monday, August 6, 2012

My first subtitle from Universal Subtitles

Universal Subtitles aka Amara provides an amazing service to make videos on the web accessible to people who are deaf and in general people who don't have access to sound while watching videos. The service aims to provide a crowd sourced method of subtitling videos which is frankly speaking, a lot better than Google's current auto subtitle method. I imagine that someday Google will reach a level of near perfection but the problem will be that it will still be, only Google's services. Other services won't be as accessible and there'll still be limitations in context that can be found only with services such as manually entered subtitles. Unless we of course enter singularity style artificial intelligence. Which is probably going to take a while given the current state of AI which as John Siracusa quite rightly describes as being less than that of a roach.

Enough rambling! Join the movement on Amara and do some good for the world. I personally have given up the world of memes on 9gag in order to devote time to subtitle a video or two at least each week. Whether or not I actually stick to this remains to be seen. But it certainly is a worthy goal to go for.

I'll write a review of the full service later but without further ado, here is my first video that I subtitled from YouTube.


Sunday, August 5, 2012

The poor state of Blogger's web interface

When writing my post on Slices for Twitter (for Android) I came to realise just how terrible the web interface for creating posts in Blogger is. It doesn't get any better in the Android version. It boggles my mind. Google is the same company that wants to push a web only usage in PCs through their Chrome OS but they can't seem to create good content creation interfaces in their own products. To give an example of just how terrible it is, this is what my original post of Slices for Twitter looked like on my blog:

[rows and rows of empty boxes]

Like that's exactly what it looked like. BOXES!!! Even better? How did I embed this video? It's Blogger, which Google owns, embedding videos from YouTube, which Google also owns, so you'd think they'd talk to each other really smoothly, right?

Wrong!!

The picture above is me putting in the URL for Slices for Twitter and that's the garbage I get as a result. Some of those results look really odd by the way. Why on earth does Google not recognize a youtube link and give me the video straight off???? Why isn't there at least an option to paste a link in??? Oh and good heavens, let's make sure we do not under any circumstances add an option to paste embed code in...

That's one scenario. Then there's the day I pasted my XAML code in. Sadly, the blogger interface was incapable of converting the special characters to their relevant HTML codes. That ended up breaking the whole post in return. Even better? When I went into the HTML and manually corrected it to say & l t ; (I added spaces because I don't know how it might get represented otherwise) the Compose view didn't show me the actual character. No. It showed me HTML code. 

Blogger has come a long way. The dashboards and Post settings and stuff are all pretty. But that toolbar on top and this big text input area in the middle is still stuck in the days when I first used Blogger.

Dear Google. 

You know what to do.

Thanks. 

Saturday, August 4, 2012

Slices for Android

Very nice looking and all of that. I agree with the review that I'm not entirely sold on the idea of slices. It would be nice if I could convert my lists into slices. A lot of the consumption mechanism feels like a hybrid of  Twitter lists and Google+ Circles. Slices even decided to implement their own discovery tab with various topics containing various slices of people curated by them for people to 'slice'

Is it just me or does it sound like a really really really bad idea for them to come up with a client like this in the wake of Twitter giving the cold shoulder to developers who might develop apps that take eyeballs away from the streams controlled by Twitter? Like the discovery tab. This is like a big part of Twitter's financial future and right here we find these guys implementing their own which will probably not see any promoted tweets or anything of the sort.

Kudos on the great design. But honestly, the developers are either really brave, really stupid or just plain Trolls.

Slices for Android on the Google Play Store

Friday, August 3, 2012

Steve's Apple vs Tim's Apple

Time posted a question on their Google+ stream asking users to weigh in on whether Tim Cook's Apple is failing after the great man, the late Steve Jobs, passed it on to him. Steve Jobs, of course, will be remembered for advising Tim Cook never to ask what Steve would have done, but simply to do what's right. Having said that,

I won't compare Tim's Apple to Steve's Apple. But I will look at Tim and the legacy he was left. As much as people want to harp on quarters, one has to admit that Tim was left quite possibly one of the most difficult legacies to handle. The iPhone was entering its 4th generation and there was now very little in terms of visible or truly revolutionary advancement going on. The MacBook Pro was also at a hardware apex. Everyone was guessing (and correctly so for once... which just goes to show) what the next features would be.

Tim was basically left a company that was at its apex, which meant that competitors had started catching up and Apple's advancements had pushed its competitors into taking revolutionary steps themselves. Revolutionary steps lead to lots of attention in the tech sphere and this attention can have a ripple effect causing consumer curiosity. And most importantly, in the midst of revolutionary steps being taken by other companies, people are starting to look back at Apple, the company that started a lot of it, to see what it will do in return. People don't get it. Apple is at that stage where they are perfecting things, not revolutionizing. As such, people are expecting something of Apple that isn't fair. You can't keep revolutionizing something without alienating your fans who find that their 1 year old purchase no longer has any support.

Having said that however, Apple is probably on their last year or so that they can continue to touch up what exists. If they haven't started yet, it's time for them to look at what they can do to really kick up a storm and get people talking and saying that the Apple spark is back. Many people forget that while Steve Jobs was a brilliant man, his most brilliant revolutions came about and from that point the company iterated to make things perfect. Given the design of the iPad I still consider the iPad an iteration of iOS devices. Revolutionary though it may have seemed it was still essentially an iteration.

Essentially, it's too early to judge Tim Cook's Apple even if there are numbers that show a slow but sure shift in balance of power. Within the next 18 months, if Apple cannot introduce something to take the general consumer market by storm, then we can start the judgments. 

Amazon's Awesome Customer Service

Last night my Kindle broke. The reason for that could be an entirely separate blog post on its own given that it's a mystery. What happened though is I didn't use my Kindle for just under 3 weeks and during that time it had discharged completely. When I went to get my Kindle yesterday to charge it so I could finish reading one of Terry Pratchett's books, I discovered that it showed the charge sign but there was an e-ink 'stain' on the bottom left. Like the e-ink hadn't been able to refresh that part of the screen. I charged it completely, switched it on and was dismayed to find that the stain remained on the screen. This was really really upsetting for me since I love my Kindle, and in the case of this one it was the newer 6" model and I hadn't even used it for 6 months!

I poked around on Amazon to find out how to troubleshoot my Kindle and came to the obvious conclusion that my screen was broken. This was step 1 of my really awesome experience with Amazon. It didn't take me a lot to ascertain what had gone wrong with the device. Step 2 was that the return and repair method was very plainly linked and explained. The main numbers I needed to call were given and this is where the best part of my experience began. Upon calling Amazon I had less than 30 seconds of wait time, after which I was received by a customer rep named Brian. Whether or not this was his real name is irrelevant to me because from that point on the service I was given was nothing like I had ever experienced before. I explained the issue to him and his only real question about the problem was whether or not I had dropped it. I said no and that it had been in its case the whole time, and so he agreed that the screen was probably broken.

To cut the long story short, at this very moment, a brand new Kindle is on its way for me; when Brian asked me for my delivery address I found it too good to believe that this was actually happening and was over the moon. To send me from over the moon all the way to Mars was that the email I got about an hour later stated that a brand new Kindle would be shipped over, any customs taxes that I might incur should be faxed over to them so they could handle it and finally all shipping costs to send the malfunctioning unit back to them would also be refunded if I gave them the tracking code.

Throughout this whole experience was the feeling that this customer service rep knew exactly what he was doing and more importantly, knew every relevant detail about me. Even when there was a minor hiccup due to the fact that I was calling the US number for a Kindle that had been obtained as a gift from the UK he just said give me a little bit of time and sorted it out transparently by informing me that he'd fill a non standard form.

One call. One word to describe it.

Wow.

Wednesday, August 1, 2012

Google Calendars and Change logs

Since I never use calendars with anyone else, the idea of having version control or a change log with my Google Calendars never occurred to me. But then I spoke to a friend who had just discovered that a whole set of calendar events added by a co-worker had gone missing, and that is a big problem. I've had this issue in the past with Google Docs, where on rare but annoying occasions data suddenly went missing after the most recent edit. The data that went in with the last edit just disappeared, but at least it showed up in the version history.

Aside from unpleasant vanishing data, there's the real concern of data conflicts when collaborating on a calendar. Appointments for a particular day are being put together and by mistake someone wipes out the 2 PM board meeting with the potential investor, and you are asking for a lot of trouble. Point is, whether these are edge cases or not, Google Calendar has the option for sharing and collaboration. And anything with collaboration SHOULD have version tracking built in with it. This is NOT optional.

Looking for threads currently where this feature has been requested. And I found a lot of them in the product forums for Google Calendar. Wonder why the Google team hasn't responded. Going social and all of that, you'd think that would be a little important at least. 

Google and Microsoft? Goodbye Apple?

Slightly insane thought... but wouldn't it be crazy if Apple goes out of fashion and Microsoft and Google are the in thing? Why do I even ask that? The Apple ecosystem is big and has a lot in it. But the more I look at it the more I think yea... but nothing exciting gets added anymore. It's just feature updates and catch-up features.

I like Microsoft. I always have. And their ecosystem is tying together slowly but surely and very nicely, and has a lot of things growing in it. Like I could see my coding going social and being connected with GitHub and IRC within Visual Studio Express. I don't see Apple ever doing stuff like that. Like you have to be some kind of elite to be on their share menu.

Google has an amazing ecosystem that looks like it wants to add exciting stuff every day. The way they've redefined their UI, whether or not it felt more Metro-ish is irrelevant, shows they can be bold. The iOS interface on the other hand hasn't really changed in any way that can be described as bold.

It's just an exciting thought for me. I by default dislike keeping my thoughts in that of the pack and I like to consider scenarios outside of the expected. And I like this scenario. It seems plausible and I want it to happen. There's room for only two. But when the third keeps nudging to get room the dominant two will fight to innovate. And that's what apple did to get on the bench. Followed by Google continuing to shove Microsoft off the bench. Followed by Microsoft making some pretty bold decisions to try and shove Apple off the bench in return. It's exciting and my money is behind Microsoft and Google.

Monday, July 30, 2012

Bug in Blogger?

I just attempted to publish a blog post on how to apply common themes in WPF but if you are looking at it you'll find that it ends rather abruptly. I was pretty upset thinking that Blogger had lost my post. I then wanted to take a look at the HTML view instead of just the compose window text and my suspicions proved to be right. Everything had saved fine, but since WPF uses markup to define its layouts, the markup I had pasted in as code happened to break the HTML as well. Bit of a fix I am in now, so if anyone knows how to solve this issue let me know.

Common Themes in WPF

An interesting issue that might present itself to any WPF developer is that there is no apparent way to set a common style for a particular control or even for a form as a whole. At least, the method is there but the properties to use seem well hidden and don't really present themselves till you get to advanced sections of a good WPF book. Having obtained the solutions to these problems I thought it was only fair that I post out on the internet what I found.

For applying themes to a common control such as a button,
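Since the markup itself got eaten by the HTML issue I mentioned above, here is a minimal reconstruction of the sort of thing I mean rather than the original snippet (the style names are purely illustrative). The styles live in the window's resources:

<Window.Resources>
    <!-- base style: a name, a target type, and the properties it sets -->
    <Style x:Key="BaseButtonStyle" TargetType="Button">
        <Setter Property="FontSize" Value="14" />
        <Setter Property="Foreground" Value="DarkSlateGray" />
    </Style>

    <!-- a more granular style that inherits the base via BasedOn -->
    <Style x:Key="AccentButtonStyle" TargetType="Button"
           BasedOn="{StaticResource BaseButtonStyle}">
        <Setter Property="Background" Value="SteelBlue" />
    </Style>
</Window.Resources>

<!-- somewhere in the window's content -->
<Button Style="{StaticResource BaseButtonStyle}" Content="OK" />
<Button Style="{StaticResource AccentButtonStyle}" Content="Save" />

(Drop the x:Key and the style is applied to every Button in scope automatically.)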



A quick look at the code will show how akin this is to the practice of CSS. There are styles that have a name, a target, and the properties of the target controls that they affect. From that point on there are styles which are based on (inherit from) the previously declared ones and can be applied in a more granular fashion. While this is pretty good, it turns out that if you want to change the font sizes and colours for an entire form, the solution is far simpler than this.
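Again the original snippet went missing, but the gist of it (a rough sketch, not the exact markup I had) is to set the attached TextElement properties on the form's root container:

<!-- setting the attached TextElement properties on the root container -->
<Grid TextElement.FontSize="16"
      TextElement.FontFamily="Segoe UI"
      TextElement.Foreground="DarkSlateGray">
    <!-- every control nested in here picks these values up -->
</Grid>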

Yessir. That's all you need. I would suggest playing more with the TextElement properties to see what else can be changed. Essentially, by setting these values you set the style for all controls nested inside that container. There are other elements to play around with, but this is it in a nutshell.


Happy coding!

Tuesday, July 24, 2012

Lift.

Lift is a simple, positive way to achieve any goal. A while back you signed up for access to our beta (or one of the current beta testers invited you). Today, we're inviting you and a handful of other people into our beta. Welcome!

This is an early version of an iPhone app that we plan to launch in August.

To participate, open the link below on your iPhone or iPad running iOS 5.0 or newer (sorry, we don't support other devices yet).

Well that just ruins everything

Check them out

Saturday, July 21, 2012

Free to write

For the last few months I've been programming most of the day, but unfortunately I haven't been free to write about all the discoveries I was making or the techniques I was coming up with, since I was working on a game for the company built on top of XNA, with the Kinect as the controller. The project was pretty hush hush. But it's now out in the open. Expect a few more details soon.

On another note, yes, this is one of those “I am back” blog posts. I'll be trying to make a point of blogging almost daily, or at least every other day, on the many discoveries I've been making at work, plus the occasional rant and muse on the state of tech today.

Look forward to it kitties

Thursday, June 7, 2012

Calling scripts from a python script dynamically

 

This is also an interesting thought in architecture. My scenario was that I needed to create a program that gets data from various sources and returns a set of structured information regardless of the source. Each source may or may not have the same data, or the data available may need modifying in order to get it into the structure needed. Every source was unique in the way its data had to be retrieved. And over time a source could change the way it gave its data, meaning that a break in retrieving information from a single source could happen unexpectedly, so care had to be taken to build an architecture where the system would not break because of a single point of failure.

A key point to note is that once the information was obtained, the way it would be further filtered, organized and stored would be a common process that would not be dependent on any source of information.

So I figured the best way to build this would be to have one separate script that works with the information once it has been obtained, which could advance further over time. Then build a separate script for each source (or for multiple sources, depending on whether the retrieval method is common enough), and have these scripts registered in some kind of config file from which they get called. There are two ways to do this config file. One would be a separate Python script, something like ‘scriptHolder.py’, which would look something like,

import scriptA
import scriptB
import scriptC
#import the rest of your scripts here

#declare a single function in this module that goes through all the scripts, calling the common method each one is expected to expose. Probably specified by documentation
def commonFunction():
   scriptA.callMethod()
   scriptB.callMethod()
   scriptC.callMethod()

Once you've done that, you'd simply add an import statement and a method call for every new script here. The problem is that there's extra code that has to be written for every addition, and overall it's annoying in a small way that makes you wish there were a more elegant solution.

So the second config ‘file’ isn't actually a file. You basically get the program to check for scripts in the folder, import them all at run time, and execute a common method that each one has to provide. So your main script would probably go something like,

import glob
import os

def findAndExecute():
   #find the scripts sitting in the folder. For each script located,
   for path in glob.glob('script*.py'):
      name = os.path.splitext(path)[0] #strip the .py out
      script = __import__(name) #import the module by name at run time
      script.callCommonMethod()
      #call whatever other methods in the framework to organize data

The key here is the __import__ function. I discovered it today, and the documentation says that most Python programs won't need it. So I'm not sure if my architecture is correct, but it works pretty darn well right now, so I'm rolling with it.
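For completeness, here's a rough sketch of what one of those per-source scripts might look like (the names and the returned structure are purely illustrative, not from the actual project). The only contract is that it exposes the common method the main script expects:

#scriptA.py - a hypothetical per-source scraper
def callCommonMethod():
   #fetch and parse this particular source, then hand the data back
   #in whatever common structure the rest of the framework expects
   return [{'name': 'item 1', 'price': 100}]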

Saturday, May 26, 2012

Learning XNA 4.0

 

So for a recent project that I need to work on, I'm building a game using the Kinect platform, and the game itself is built on top of the XNA 4.0 Framework. Now, I learnt the fundamentals of making an XNA game back in 3.0 and stopped short of making a 3D game, since at that time all I needed was 2D. In fact, all I need even now is 2D. But it's been a while, and I prefer to keep just concepts in mind in the long run, since memorizing syntax is time consuming and inefficient in my opinion. So, to bring myself up to speed, I decided to get the book “Learning XNA 4.0” and go through it to brush up my knowledge. One of the things I needed was the sprite-sheets, which I didn't have anymore. And that's when I ran into my first strange roadblock.

“Where are the resources??”

I combed through the book, and when I couldn't find anything I decided to head over to the O'Reilly page for the book. It seemed I wasn't the only one facing this problem. A large number of the one-star reviews were about the fact that the source code samples weren't linked anywhere in the book and couldn't be found on the book's page either. Smart job, editors! Long story short, I finally found the resources.

Go download them here (45.9 MB):

Sunday, May 13, 2012

Bing what the what???

I love pressing those buttons that let me be the first person to try out the ‘all new design/version/beta thingamabob’ that's coming soon. So when I was presented with the prompt for Bing, bear in mind I hadn't checked out any of the stuff on the internet yet to know what it looks like, I excitedly hit YESS OH YES OH YES SIGN ME UP FOR THAT!!!!

Well… Whatever was the equivalent of that at least.

And my first search result gave me this.

[image: screenshot of the new Bing search results page]

I'm sorry. Did I miss the memo that we had returned to 1024x768 screens with no kind of CSS to help with responsive layouts? If not, can someone from Microsoft explain just what the hell that white space to the side is? (Circled quite obviously in blue by me.)

I love white space. I love clean. But I Do.Not.Like.A.Vacuum.

If I want outer space I’d rather save up for one of those outrageous space tourist things. Anyone knows what happened to those?

Hello Google+

You had better be my answer, you know. I come to you in a time of sickness, so don't make me look back on this as a moment of delirium. I'm tired of Twitter. Tired of Facebook. Tired of Foursquare. Tired of Path. Tired of Instagram. Tired of Flickr.

Not tired of each one individually, but tired of managing them as a whole. It's not social media burnout, but I've got a lot to do during the day, and if I have to visit multiple places and keep multiple accounts on the same network just to piece the separate streams together into a cohesive whole, then the system is broken.

So I turn to you Google+. In my moment of desperation I turn to you to find peace of mind where I may journal the pieces of my life and document the fragments of each day's discoveries. To discover the moments within the lives of others based on context and interests. I don't need Banjo. I don't need multiple widgets. I just need one. Will I find ease of navigation in my separate circles of photographers and techies or will I once again be overwhelmed at the multitude of circles and resign myself to accept that I should stop breaking myself?

Time will tell I suppose. Your move Google+. Let the circles begin.

Saturday, April 14, 2012

The Grand Windows Phone 7 Challenges

And what’s wrong with it.

From the very start, Windows Phone 7 has been all about the phone that is going to save you from your smartphone. From the ‘really?’ ad campaigns, to the ads of people falling over themselves trying to get tasks done as the Windows Phone 7 users breezed along, to Steve Ballmer repeating over and over the mantra that went something like ‘get in, get out and get back to life’, all the way to the Smoked by Windows Phone 7 and Blown Away by the Lumia 900 campaigns, it's always been about the idea that your smartphone shouldn't weigh you down from experiencing the world. I think that's a pretty awesome concept, but I think Microsoft might actually be giving off the wrong idea, or at least an idea that doesn't really resonate with a lot of people. Especially iPhone users.

Have you watched people walking around these days? Even when best friends or a husband and wife walk around together, instead of talking to the person right in front of them they have their heads buried in their phones. Sometimes there's a very strange form of social interaction going on where they interact with each other over social media. It's creepy, but it is what it is. The reason the Windows Phone 7 message doesn't resonate is that people don't have their heads buried in their phones because it takes them 9 – 15 seconds longer to check in or post a status update to Facebook. It's because they are just so immersed in whatever is going on in their online social life. No matter how fast their phone runs or how easy it is to get tasks done, it doesn't matter. People are just going to keep tapping away at their phones.

I do respect and totally support the message that Windows Phone 7 is trying to give out. We've all just gotten way too caught up in our digital lives and it's time to free ourselves up. But that's just like what people thought inventions like the microwave and the washing machine would do for our lives. Take the stress out and do jobs for us faster. Instead, all they did was add to the stress as we filled up that free time with more and more work. It's the same thing with Windows Phone 7. With people using it on account of actually being able to get more done in a shorter span of time (and in the social media world, 10 seconds is quite a bit of time), they are actually going to get more and more immersed and fill up the newfound time with even more social media activity. And that just breaks the whole campaign down. Because people don't want to be spending less time. They want more time and they want to DO MORE in the little time they have.

I honestly think Microsoft would have been better off with ad campaigns along the lines of people all standing around in a crowd trying to capture a moment, such as an eagle landing in front of them. And while everyone else is desperately trying to get past their lock screens, the person with the Windows Phone 7 has taken the photo, tagged the friends, uploaded it to Facebook and is back to watching the eagle fly away while the rest still have their heads buried in their phones. So much better noh. Best of both worlds. And that scheme is reusable. Concerts, bowling, all of that. And this is all off the top of my head. I think the keyword here is comparison. That's why the whole Smoked by Windows Phone 7 campaign is actually half decent, because it's all about comparisons. In fact, there's a large social acceptance aspect that causes people to buy smartphones that Microsoft isn't playing with. They have to make the phone look cool enough that the friends surround the person with the Windows Phone 7 device. And I'm not talking about the cheapo ads that involve women rushing over to the guy. I'm talking about genuine ads, like friends saying check out this album and struggling to open an app and find the album, while the guy with the Windows Phone 7 looks on for a bit and shakes his head before opening the pinned shortcut to the album on his start screen and taking control of the show.

Noble as Microsoft's message may be, I think they are in the wrong business to be giving that kind of message. You don't come into the smartphone competition to try and save people from the digital ecosystem. You come in to make their lives easier, more fluid and a lot more enjoyable within said ecosystem. If you want to give a message like the one Microsoft is currently giving, at least mix it up with being able to experience the best of both worlds, like the example I gave of the eagle advertisement. A lot of smartphone activity cuts into the moment right now. Make the whole pitch about being able to live the moment in both real life and digital life.

Only with Windows Phone 7.

Sigh. Who comes up with these ad campaigns anyway?

Monday, January 23, 2012

The takedown of Megaupload

And more importantly, why a lot of people are thinking of it the wrong way. Unfortunately it seems that most people are making two mistakes. First, they are connecting this takedown to the huge uprising against SOPA. Second, as a result, they are making the move to take down Megaupload seem like something terribly wrong.

They are all wrong.

Taking down Megaupload was perfectly legal, and merely coincidental with the massive uprising; I don't believe it has anything to do with said opposition to SOPA. The important thing is that opposing the move to take Megaupload down doesn't make any sense, and it just shows how woefully ignorant most people are of the facts.

Firstly, this isn't how SOPA is supposed to work; this is exactly the kind of operation SOPA wouldn't need. If SOPA is passed, they don't need to go on gung-ho missions arresting some hick way out in the boondocks just to get a site shut down. They are free to remove the site from the directory of internet addresses inside the USA while granting protection to the ISPs, i.e. providing indemnification against any and all lawsuits.

Second, the evidence against the Megaupload CEO and his employees is absolutely solid. And this is by far the most important point. Captured emails show communication between employees displaying their knowledge of the pirated content on their website, and advocating its use as well. If that wasn't bad enough, in their allocation of uploader reward points, inspection of accounts shows that they knew the uploaders were guilty of uploading a few movies, some porn videos and several key generators/cracks. Yet they still went ahead with the allocation of uploader points and thereby directly supported piracy within their file sharing network. Despite all their publicity stunts about wanting other file sharing sites taken down, claiming Megaupload was the only one that actually respected takedown notices, the evidence shows they were probably among the worst offenders themselves.

Third, the evidence against Megaupload has been two years in the making. It isn't something that took place just yesterday, in time for the uprising against SOPA. The people behind the takedown waited for the right moment to sting, and whether or not there were external powers wanting to send a message to the people calling for the death of SOPA, the fact remains that there is no direct connection between the two. The takedown was simply an act of justice, and one that is absolutely justifiable.

Finally, I see a lot of people also forgetting the core concept that piracy is not a good thing. Apparently because musicians and movie directors are living on piles of money, it's ok to steal the works they put a lot of time and effort into creating. Granted, the media industry is full of backward folk, but that doesn't give anyone the right to say that piracy shouldn't be looked at as a bad thing. Piracy is stealing, and if you are ok with that then good day to you. Quit siding with Megaupload against the people who took it down. It hurts those of us who are trying to stop bills like SOPA from going through. It doesn't work that way. When people oppose things like the Megaupload takedown, it just goes to show that the majority of people really don't know what they are talking about, and that's the reason people in Congress and the Senate aren't sure whether or not to take most of this opposition seriously. It seems like every time there's some bill related to anti-piracy, most people call it Armageddon and immediately ask for it to be thrown out. It's not like anyone suggests any viable solutions either. How do you take a bunch of people seriously when all they do is complain about every solution proposed to stop piracy? How do you take them seriously when they never seem to be satisfied and never seem to provide any good solution themselves? How do you take them seriously when they start complaining even about a completely legitimate takedown like Megaupload's? When people fail to make sense and simply sound like a pack of brainless parrots in an echo chamber, the effectiveness of the real message that genuine people are trying to spread gets lost.

Here's a little something to think about. How many of the people who upvoted all those posts about SOPA and PIPA on 9Gag do you honestly think took the time to study the bills and find out what really makes them such a disaster? My bet is very few. If there really were a bunch of people who were serious about this, then the posts about Megaupload probably would not have bubbled to the top. Bottom line: take some time to think before just opposing everything that comes down the pipeline. I made that mistake, and I can guarantee you, when it eventually hits you really understand how foolish it all is.

Saturday, January 21, 2012

Big media, SOPA, and the real solution.

A while ago I recall reading a statement from the CEO of Steam giving his opinions on game piracy. I believe the man is a true visionary, and it is people like him who ought to be put in charge of drafting anti-piracy legislation in Congress. That, or the media corps ought to take a few lessons from him on how to combat piracy, because he is one of the few people who believe the way forward is offense, not defense.

This may seem like an odd statement given how offensive the media corps are these days. But the reality is that the media corps only appear to be on the offense with legislative motions like SOPA and PIPA and goodness knows what will come next down that crap line. These acts are really acts of defense, where the media corps want to huddle up in their traditional business models and traditional media formats while lashing out at people each time they see some pirating site come up on the horizon. This is just plain stupid, and it's the reason the tech community and so many others are standing up against whatever the media corps do to shut down piracy. People want a way forward. Out with the traditional plastic discs. In with digital. But no. Media corps are scared because the new medium is just so much more transferable and therefore easier to pirate. So they go on the defense, lash out against the digital medium, and decide to hole themselves up with their good old fashioned DVDs. And then they wonder why people rise up against them. It doesn't even make sense anymore. To say that DVDs are a good form of protection against piracy because they are less transferable is stupid. I could rip a DVD in two hours and share it with my friends if I wanted to. But of course the laws against ripping will stand in my way, right? Again, defensive action. They try to stop people from ripping DVDs by saying they can sue them in court for it. The media corps think that wrapping some kind of copy protection around the disc will lock the DVD format away from everyone else. But I'm sorry, that doesn't work. People want to go digital. They don't want to be looking at a mountain of discs when they can instead have one 5 TB external hard disk which contains everything. Even more importantly, with increasing internet speeds, people don't even want the 5 TB hard disks anymore. They would prefer to just be able to stream everything.

But of course the media corps continue to ignore these signs and prefer to stay holed up in their little closed-off worlds. As Steve Jobs quite rightly put it, “You guys have got your heads up your asses”. Now what did the CEO of Steam say that clearly showed he had the right idea? On piracy, all he had to say was that it isn't about just the game anymore. It's about the service provided with the game. If a person finds that ordering a game DVD doubles or triples the cost due to shipping to his or her country, with a waiting time added on top, then piracy becomes a natural option to turn to. After all, the pirates are providing a better service than the game company itself. But in the case of Steam, the CEO believes that the Steam platform provides a better service to gamers. Whether this is true or not is debatable, but whatever said and done, the man has the right idea. After all, through Steam I now have so much more enabled. I can download a game just like I would a torrent, and sometimes I can even get the game for free. All patches, future updates and expansions become just as easily and readily available. Online gaming with other people becomes easy, leaderboards with friends are automatic, and even buying a game level by level becomes possible.

Imagine this with the media corps. What if they put all the effort they are putting into foolish bills like SOPA into making a media platform greater than any other? What if they made a platform where all movies could be streamed, with ads or without ads depending on how you wanted to pay for it? Imagine a platform where people from anywhere in the world can now get a movie without having to worry about shipping costs and other unnecessary details like storing physical goods. And imagine a platform where users are automatically connected to fan pages and merchandise, and where buying a movie gets them inside access to behind-the-scenes material and other constantly updated media such as director's and artist's notes. When it comes to advertising, they could even serve localized ads to make the programming feel more local. Then when I watch TV series and movies for free via this new platform, I won't have ads from Geico but instead ads from my own country, which makes more sense.

This is the kind of stuff people should be thinking about. Not how to stop people from downloading movies off torrent sites. Media corps need to instead answer the question, “why do people even want to pirate something???” There really aren't that many thieves in the world. People just want easy access to something, and if the media corps aren't going to provide that access then people might as well turn towards someone who does. And in this case, that someone is the pirates.

A good example of how stupid media corps can be comes from CBS and The Big Bang Theory. Today I went online just to check what episodes they have, and it turns out that they have one episode from season 3, and episodes 12 and 13. I mean, really? And there's no way I can even pay to watch the rest online. So what have I got to do? Either buy them from Amazon, or take a trip down to a DVD store and buy a bunch of physical discs I neither want nor need. Even if I really don't want to pirate, all of a sudden I don't care. The pirates provide me with a better service, so why should I go to CBS? Defiance sets in. And I give the finger to CBS and go ahead and switch on the torrents to get my episodes from season 1. Good move, CBS. Really well played.

‘Nuff said. Get your act together, media corps. You've shown the world just how much influence and power you have. The world is seeing unprecedented growth in people wanting to develop stuff to make media more readily available around the world. Use your money and power to grab those developers and set them free to build a platform like no other. One that enables the world to become one.

Wednesday, January 18, 2012

I feel like a dork

Biggest mistake ever. The other day I was doing a Facebook engineering puzzle using C++, and the puzzle description warned about performance and how important it would be to go through a large amount of data really fast. Before I go any further into that, let's talk about Project MADA, that now defunct hoo-ha of mine.

When I was doing the project back in university, I had to look up bigrams within a large amount of text. Essentially I had to keep a running tally of all possible combinations of two letters {aa,ab,ac,ad….zw,zx,zy,zz}. That's 26^2 = 676 combinations.

My solution to this problem turned out to be a list, where I would store each bigram along with its corresponding count as a tuple. Thus, for each bigram I read into the system, I would go through the list to see if I had a match. If yes, I would increment the count by 1, and if not, I'd add the new bigram along with a corresponding count of 1.
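In rough Python terms, the approach looked something like this (just a sketch of the idea, not the original project code, and using small lists instead of tuples so the count can be bumped in place):

counts = [] #a list of [bigram, count] pairs

def add_bigram(bigram):
   for entry in counts: #linear scan through the whole list on every insert
      if entry[0] == bigram:
         entry[1] += 1 #found a match, bump the count
         return
   counts.append([bigram, 1]) #no match, add it with a count of 1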

Back to the Facebook problem, which is Liar Liar. There's a forum of users, and you have a list of people who have come forward to name the others as liars or truthful. From this knowledge, you have to derive who is truly truthful and who is just a big fat liar. The logic for this is a no-brainer. The implementation, however, requires keeping track of the number of accusations corresponding to each name. Now I could have used my bigram method, but for some reason, just because I was doing a Facebook puzzle, all that was in my head was efficiency. And my first thought was a hash table (a map in C++) where the key would be the person's name.
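The same counting job with a hash table, again sketched in Python rather than the C++ I actually used, ends up trivially short by comparison:

from collections import defaultdict

accusations = defaultdict(int) #name -> number of accusations

def accuse(name):
   accusations[name] += 1 #constant-time lookup and update, no scanning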

Two days later I was thinking of my FYP, and I realized how stupid I must have sounded coming up with my method of inserting stuff into a list and keeping track of it when the hash table was right there for me to use. Instead I had an implementation which was utterly inefficient (a nested for loop, traversing the list each time to find an element; talk about a worst case), and what's worse, I feel like a dork because I stood in front of my lecturers proudly showing off my implementation as if it were the best thing since peanut butter and jelly. I can't even believe the hash table didn't occur to me.

I can't help but wonder if they thought I was a bit of a dud for not using a hash table. Sigh.

On a side note, it truly is amazing just how differently I thought simply because the thought of Facebook efficiency was in my head the whole time.