As of 2016-02-26, there will be no more posts for this blog. s/blog/pba/

An open source project led me to read about Go on Wikipedia, which mentioned there was already a language under almost the same name, except it has an additional !, that is Go!

The creator of that language asked Google to change their programming language's name, because he feared the big guy would end up steam-rollering over his, and I think he was right: nobody had heard of it, and even its Wikipedia page was created after Google's Go page.

The issue was marked as Unfortunate with a comment: "there are many computing products and services named Go. In the 11 months since our release, there has been minimal confusion of the two languages." I had never heard of anything named Go until Google's Go.

Of course it's minimal, because no one else would ever know about Go!. When you search for Go! programming language in Google Search, you get no Go! in the first three pages and probably never will.

Don't be evil and Do the right thing are just a slogan and a motto that sound nice. Don't believe anything corporations say and do which you read in the media; it's all publicity, nothing more and nothing less. They are created for one thing: profit.

I began to use Block unwanted website in late February; just a month later it broke, and it has stayed broken for almost 4 months since then. A discussion was started back in late March. As with some Google products, if it's not the current hot product like Google+, you often get a late response or nothing at all from Google's staff. Luckily this time, we did have a couple of replies from a Google employee.

On Mar 18, the OP posted about the issue; three months later, on Jun 19, a Google employee finally replied to acknowledge it. Better late than never, right?

Half a month later, on July 4, a second reply from the same employee said the team was working on the issue and offered barely a workaround for unblocking, which I don't need and already knew about. The most important part is still missing: no mention of the problem itself, of why the function doesn't work.

When I first noticed the issue, it was as if someone had deliberately pulled a minified JavaScript file from Google's server.


To me, it doesn't look like something is actually broken in the code. Like I said, it looks like the script was pulled, so the functions simply aren't available.

I don't know what the real cause is, only Google knows, but I guess we, the users, must have blocked a lot of sites. 500 sites are allowed per user; that's a lot. Maybe Google can't handle that kind of per-user filter. But that's only my guess.

The thing is, Google must have known from the beginning if they did pull the JavaScript, and I really hate Google for the late response and the lack of proper handling. Besides, when suddenly no one was blocking websites, they must have noticed the database stopped growing. There is no way on Earth they didn't know when the issue appeared. They are Google; this blocking data is worth some statistics even if they never planned to use it in the ranking algorithm.

If they want to pull the functionality, that's fine by me. But they need to tell people. Just put up a notification saying the function is temporarily disabled; that's really okay with most people. Disappointing, yes, but much better than not knowing the cause.

I sincerely feel I have come to dislike Google's way of managing things more and more over recent years, and this case is just one of the reasons. They keep talking about government transparency, but they aren't even transparent enough to tell us the cause. They don't need to give the technical details, most people wouldn't understand anyway. A simple summary would satisfy those of us who have been waiting for an answer and a resolution for nearly 4 months.

If you own a domain name and have searched using it as a keyword, you may have seen the kind of results shown in the screenshot on the right.

The last one on this page is totally legit and the first one is okay, but between them are websites I categorize as garbage. This kind of website is a variation of the content farm. It's not the usual content generation; it uses domain-related data to fill up the page so it looks like something in a search bot's eyes. They grab all sorts of results via other services' APIs and give you some whois information.

That's really for noobs to read, people who don't know where to look for information about a domain from the original sources, mainly for their own domain. I don't think anyone generally wants to read someone else's domain information, at least not from this sort of trash website.

Unfortunately, Googlebot crawls and indexes this kind of website. The time range in the screenshot above was set to within 24 hours and the search hit 75 results (it had increased to 80 while I was writing). Sadly, I haven't been able to use Block unwanted sites, which is a feature of Google Search. That page has a JavaScript error; it's been broken for days since I noticed. I don't know where to report a bug except the community support forums, and I do not want to use those. Just another typical Google support method nowadays.

An interesting point is that my domain isn't even the focus of the matched results. These websites put a list of domain names next to basically unrelated domain names, so they can somewhat increase their search engine hit ratio. It's cheating, I would say, and Googlebot isn't smart enough to know that.

I was looking for an API for the data in Public Data Explorer, thinking about presenting the data in different visualizations (BigQuery is probably the right one; Google has too much stuff and it's too confusing), then I saw this:


Who would list cURL as a programming language? Google really needs someone to proofread. I am sure this mistake was made by a writer who added the entry for that Bash example, which uses cURL to retrieve data.

As a developer, you wouldn't make such an error; this isn't even a simple typo. It must have been written by someone without basic knowledge of a development environment. They could either let a developer write the draft and have a writer proofread it, or the other way around. Either way, I doubt this page was proofread.

Last time, only a few days ago, I mentioned issues in the Google Docs Spreadsheet and Analytics documentation. The underlying problem is the same: there seem to be no developers involved in writing the documentation (outside of the code). That sounds ridiculous, but it's what I felt after reading it.

And there is a more important issue about the code and other content on Google Developers; see the following screenshot:


There is no note about the license of the code, except the copyright statement in the comment block. If I saw this kind of code block anywhere else, I would not even try to read it.

The question to ask is: may you use this example code freely? May you even select the whole code and press Ctrl+C to reproduce it?

I don't understand why Google neglects or forgets such an important thing: noting the copyright and license on every page. (They have a legal department, don't they?) To see the permissions and license, the fastest way is to click Terms of Service and then Site Policies. That's two clicks, and only if you know where to look, and I really need to quote:

You will find the following notice at the bottom of many pages on the Google Developers website:
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 3.0 License.

[snip]

You may also find the following notice on the bottom of some pages:
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 3.0 License, and code samples are licensed under the Apache 2.0 License.

Okay, which one? Wait a minute, I see neither. WTF.

Google announced their latest update on Transparency for copyright removals in search. They receive more than 250,000 requests per week; that's not a shocking number to me. If people knew removal requests can be submitted via a web form, I am sure the number would be much higher, and I intend to submit my own removal requests in the future.

I have seen some of my content duplicated in Google Search results. Some claim that's for mirroring, in case of sudden unavailability, but that's just pure nonsense these days. Unfortunately, I licensed some works under Creative Commons, which left me unable to take any action even if I wanted to.

I have written a post telling you not to license your work, and you should follow my advice, so you can take down those garbage websites when they rip off your Flickr photos or your articles.

Recently, one of my popular free-to-rip-off Flickr photos got ripped again, and it is not even a good photo by any means. Those rippers don't even care about the quality of the photos. They find whatever photo relates to the keyword they need, then shamelessly use it with the same generated text across multiple sites.


It's not only your copyright being violated but also your reputation, which is a more serious issue for everyone. See how my username gets attached to that page in the red box in the preview image? It uses "by livibetter," which may read as if the page were written by livibetter if the reader isn't careful. I doubt any human would even pay attention to that page. However, I worry more about what Google's bot thinks of it than about real people, and as you can see from the text preview, it sort of confirms my worry.

Sigh... if I hadn't licensed that photo, I could take that page down, since it would be violating my copyright. That's not only good for me but might also help other people a tiny bit; for example, you wouldn't see that entry if your search unfortunately hit on it.

There are plenty of pages like that. If you have ever searched for your own content, I am sure you have found your works being violated. This world is filled with good people, but with shameless people as well.

In the Google blog post, there is one thing very interesting:

For example, we recently rejected two requests from an organization representing a major entertainment company, asking us to remove a search result that linked to a major newspaper's review of a TV show. [emphasis mine]

So, shameless people again, huh? They do whatever it takes, abusing you in every way they can to screw you up. Producing bad TV shows results in bad ratings; that is perfect logic. But they want to fix it by muting other people's voices, and how absurd is that? They maliciously use copyright claims as a censorship tool and attempt to manipulate people into thinking that's legit and ethical. Sounds familiar? Those people truly have no moral standards in their hearts. Wait, I doubt they even have a heart.

I wish Google would release such information in the future. I tried to find out about denials in the Transparency Report, but I can't see anything about the outcome of a request. It would be great to see those people have their ridiculousness exposed to the public.

By the way, GitHub has a repository of the DMCA takedown requests they receive; read the commit messages, a few of them are quite funny.

Go search zerg rush and see how you do.


(In case you don't know yet, GG means Good Game)

I use a touchpad, so these Oerglings must not have got their upgrades. SlOlings, I assumed. Wish I had the blue flame...

Where is dat EPM, anyway?

Even with all these systems and people working to stop bad ads, there still can be times when an ad slips through that we don't want. There are many malicious players who are very persistent; they seek to abuse Google's advertising system in order to take advantage of our users. When we shut down a thousand accounts, they create two thousand more using different patterns. It's a never-ending game of cat and mouse.

This sums up how bad those abusive people on the Internet are; even Google confirms it and has to spend a lot of effort dealing with them to keep them away. I recommend that you read the post, just to know how serious the situation is and what steps Google is taking to fight scammers. I don't feel Google is winning, but it's trying.

I read it because I need to read something; I am forcing myself to read every day. Not that I was interested in this topic, but I was glad I read it with patience. Google has a system involving three different aspects to jointly decide whether a subject is bad or good, whether they (machine or human) need to take further steps or just block it, and so on.

You will find that humans truly are the last line of defense in all three aspects; they make the final call when the machine can't tell. Like other modern systems, decisions aren't made by fixed rules; they can evolve and learn from mistakes (errors, which humans point out). Still, humans are the most crucial component.

If I recall correctly, I have only reported a Google ad once. That ad tricked me into clicking on a text saying "Close," and I did click on it. It was an instinctive reaction when the ad kept flashing strong colors, somewhere between red and yellow. It was so annoying. I rarely see that kind of trash ad via Google's advertising system, but I did that time.

Out of natural instinct, I clicked on that "Close," but that would never close the ad. Instead, I was brought to the ad's website. I was fooled, clearly.

That was the only time I saw a bad ad via Google; I think Google does a good job of protecting their profitable advertising system. There might be more, but I hardly pay much attention to ads, and my eyes and brain are a well-trained ad filter, though some slip through. I care more about email spam, since email is the main place I see spam or phishing.

I would believe that Google has actually brought those scammers to court; they have done so with spammers. They have all the information about their advertisers, so there should be no problem filing a lawsuit against a scammer. Name, address, phone number, credit card, everything they need is already in Google's database. Some of it could be fake, but it can't all be. Identity theft? Come on, this is not a drama.

Anyways, I am looking forward to more posts like this.

It is always a wonderful experience to witness an evolution. Since the invention of the typewriter, only low-skilled users have required 10 fingers to operate a keyboard. To this day, that remains perfectly true.

As technology progressed, the computer entered our daily life and has been dominating every corner of the entire world since. At the beginning of commercial computing, only people like secretaries, data-entry clerks, or code monkeys needed the no-skill 10-finger typing, whilst bosses used only 1 or 2 fingers to type.

Typing with 1 or 2 fingers is the hardest way to type because, firstly, you need to clear the field. You need a clear view of the keyboard so you can locate the key you are about to press. Imagine that there are around 100 keys on the board and you need to spot the right one as fast as you can. On an aged keyboard whose printing has worn out, you may even have to guess which key is correct; this only demonstrates how skillful the 1-or-2-finger typist is.

After the desired key is spotted, the typist's motor skill kicks in: he or she hits the key with a quick and precise press, then immediately retreats from the keyboard.

At this moment the task is far from complete. The typist must shift focus from keyboard to monitor, and the typist's eyes have to adapt to such a dramatic change in a short time, seeking the newly typed letter on the screen at the cursor position, which the typist had to memorize before typing.

Once it's confirmed, the process repeats. As you can see, the 1-or-2-finger typist is more skillful than the 10-finger typist, who only stares at the screen all the time and types like it's nothing; there is no skill in them.

Since the mouse was invented, and still today, it has basically been a three-button pointing device. Some 1-or-2-finger typists have evolved to be even more skillful and now may use only one hand to control the whole computer, the same hand that controls the mouse.

Using a mouse is even harder than typing. The main goal is to find the Next button on the screen, which isn't always there or can be hard to see. Often, mouse users have to combine different controlling skills, such as double-click, right-click, drag-and-drop, etc. On the keyboard, there is only one: press.

Later, mobile devices became more important in one's life. The screen got smaller, and so did the keyboard, physical or virtual; the typist's skill was forced to evolve once more.

Then the groundbreaking device was invented: the mighty iPhone, and later the iPad. The typist is no longer a typist but a toucher. With only one finger to operate, as you can guess, it takes smarter people to use such devices. They have to learn new gestures in order to operate them, such as the slide. Fortunately, multi-touch was added later for the less skilled people, who were used to 10-finger typing.

Not only the devices evolve, but also the web. Less than a year ago, Google began a process to unify its products' designs, starting with the navigation bar; Gmail, Google Groups, Google Reader, and more have already adopted it. Things got bigger, such as buttons, which is good for the toucher to touch. The look is more like the designs on a *pad, which millions of websites have already imitated. A more recent example is Google+: Toward a simpler, more beautiful Google.

Google was wrong about it; it never was simpler. Such a design requires users' long-trained skills. For users who have trained themselves up from 10-finger typing to 1-finger touching, it is simpler. But for newcomers, it requires a long period of training. Unfortunately, for the less skilled 10-finger typists, the old, stubborn users who just won't die, there is no resolution.

In the foreseeable future, humans may no longer need any fingers to operate a computer. However, it takes an even more skillful person to operate without a finger, as has already been proved by the decades of computing history described above.

I just read the news that Twitter filed a lawsuit against five spammers. This post's title explains what I thought when I read the news, but according to The Guardian, Facebook and Google have already done the same thing with success. I don't keep an eye on Facebook, but I am pretty sure I didn't read any news about that from Google's blogs.

Twitter posted about this on its blog, which I hadn't read for about two years. I like that: take a strong stand and clearly show your action against spammers.

Go Twitter! Sue the hell out of those disgusting people. Wasting their skill, coding programs that abuse other people and systems. The governments should try to build up criminal cases against those people! Darn it!

I have never understood companies like Google that don't try to show and execute their principles for how they deal with the spamming issue. I see nothing from Google; occasional blog posts don't really count. If you have read the Google Groups for Google products, you often see spam dealt with only after days; sometimes it just flushes off the first page of posts and everyone forgets about it.

Indeed, there is a flag button. But that's your product's support forum. Okay, community support! Such a shame, using customers to help other customers. Some of those contributors don't even know what they are talking about. And on the Google blogs which still allow commenting, you often see well-designed spam slip in, passing Blogger's spam filter. Which is fine, the filter can't be 100% effective, but you should have someone actually read the comments! Hire more people to run the forums and blogs, Google!

No one would think you are a bad company when you take on spammers. You are terrible when you keep quiet about it, even if you are doing something behind the scenes. Just yell it loudly, for garsh's sake!

Sue them, then donate the money, that's how you boost your public image.

Note

The site is dead and some links have been removed. (2015-12-14T06:43:25Z)

Yesterday, I was looking into an old thing I created using Google App Engine, which evolved from a Bash script back in December 2009. Here is a screenshot:

http://3.bp.blogspot.com/-Y4N6Vh1IwOw/T2w65ZZX0wI/AAAAAAAADDs/5mAnSjXViIs/s800/2012-03-23--16:52:45.png

Yes, I shamelessly Google/Yahoo/Bing my username and yes, I unblushingly made a record for the number of search results with a chart. (Cough, two charts and I have this.)

When I opened that page, I noticed the records stopped on October 3, 2011. At first, I thought it might be the result-fetching limit in my program, so I went to update the code, but I realized it was not that.

Something had gone wrong: I saw a huge amount of unsuccessful tasks in the task queue, with retry counts in the thousands. I looked into the logs and found the problem was with the Yahoo Search BOSS API. The domain of the v1 API is gone.

So I googled and found the announcement for the v2 API: v1 was scheduled to be shut down on July 20, 2011, but it lasted until August 27. Then on September 20 it came back and lasted until October 3, and it has been gone for good since then.

My program requires all three APIs to return successfully before it writes data into the datastore. I should check the task's retry count, and if it is too high, drop the task and send myself an email. But I didn't code it that way, because until yesterday I didn't think there was any chance of thousands of retries. I lost about 6 months of data from Google and Bing. Will I add the email notification for something like this? Nah.
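If I ever did add it, the guard would be small. A minimal sketch of a Python App Engine task handler (the handler name, route, sender address, and retry limit here are all hypothetical); App Engine reports how many times a task has been retried in the X-AppEngine-TaskRetryCount request header:

    import logging

    import webapp2
    from google.appengine.api import mail

    MAX_RETRIES = 10  # bail out long before the count reaches the thousands

    class FetchCountsHandler(webapp2.RequestHandler):
        """Hypothetical task handler that fetches the three search APIs."""

        def post(self):
            # App Engine adds this header to every task queue request.
            retries = int(self.request.headers.get('X-AppEngine-TaskRetryCount', 0))
            if retries > MAX_RETRIES:
                logging.error('Search-count task failed %d times, dropping it.', retries)
                mail.send_mail_to_admins(
                    sender='admin@example.com',  # placeholder sender address
                    subject='Search-count task dropped',
                    body='Gave up after %d retries.' % retries)
                return  # returning 200 stops the task from being re-queued
            # ... call the three search APIs and write the counts to the datastore ...

    app = webapp2.WSGIApplication([('/tasks/fetch-counts', FetchCountsHandler)])

A few lines like that would have turned six months of silent failures into one email.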

Yahoo Search BOSS v1 is not the only one gone; the Google Web Search API will be shut down around 2013 as well. It was declared deprecated before Yahoo's, but it has a longer transition time for developers, 3 years.

I think they both moved to paid versions of the API, v2 and Custom Search. I don't use the APIs to make money, so when they are gone, my program stops updating.

I was watching a live stream and reading the chat alongside it. Everything seemed normal until I couldn't open new websites, and a script I have that gets updates from all sorts of services was also having trouble.

Although the stream was still playing smoothly and chat messages were still coming in, I couldn't ping by domain name, so I decided to restart my PPP connection.

Well, it still didn't work. I noticed I could ping by IP address. I could also ping the DNS servers by IP and resolve domain names with @dns_ip. Somehow the DNS servers, my ISP's, weren't working normally.

I even checked the routing table; it was fine.

Restarting the ADSL router was the last hope, guessing it might be just some upgrade glitch on the ISP's end. Well, it didn't work either.

Since I knew the problem was with DNS, I thought I could just switch to other DNS servers. The funny thing was I didn't have the IP address of any other DNS server.

Luckily, since I could use @dns_ip to look up Google's IP, I could still use Google's IP to search. Thank god! Amazingly, Google seems careful about such situations; I saw everything in the search results. Instant search was reacting to my typing, and images loaded normally. Everything was accessible via that single IP address I connected to. (Google Search has many IPs.)

The only one I knew I could probably trust, and that is open to public access, is OpenDNS. I couldn't just click on a result to go to the website, since that requires DNS to resolve the IP address, but I managed to get the IP address displayed in a plain search result with the right keyword.

So, this tells me: you'd better keep a list of backup DNS servers.
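For what it's worth, the diagnosis I did by hand, "direct IP works but name resolution doesn't", can be scripted. A minimal Python sketch, assuming 8.8.8.8 (a public DNS server, reachable on TCP port 53) as the known-good IP and www.google.com as the test hostname:

    import socket

    def can_reach_ip(ip='8.8.8.8', port=53, timeout=3):
        """Connect to a known IP address directly; no name resolution involved."""
        try:
            socket.create_connection((ip, port), timeout=timeout).close()
            return True
        except socket.error:
            return False

    def can_resolve(name='www.google.com'):
        """Ask the system resolver (e.g. the ISP's DNS) to resolve a hostname."""
        try:
            socket.gethostbyname(name)
            return True
        except socket.gaierror:
            return False

    if __name__ == '__main__':
        if can_reach_ip() and not can_resolve():
            print('Link is fine, but the configured DNS is broken: switch to a backup DNS.')
        elif not can_reach_ip():
            print('No basic IP connectivity; the problem is below DNS.')
        else:
            print('Both direct IP and name resolution work.')

It only tells you which layer is broken; you still need that backup DNS list written down somewhere that doesn't require DNS to reach.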


I saw this type of spam a long time ago, but didn't mention it.

Blogger keeps adding new features, just like other Google products and services, but leaves plenty of issues behind. Go open an issue tracker; 90% of the time you will see lots of issues that are damn old and unresolved.

As a normal user and someone who can code, I am really sick of it. It's not like you can fix it on your end; these are server-side things. You can work around them, you can pretend you didn't see them, but you know the issues are still there.

Do they really use their own products? They must be aware of it.

There is no doubt that Google creates a lot of good stuff, but there's also no denying this: they also close/kill/terminate/abandon good stuff.

Google, assign some developers to this kind of thing and to old issues! Hire some relationship managers like Jen to bridge API users (the developers) and your developers. You don't need to deal with normal users, since you already have lots of so-called "expert" users on your help forums.

But it's Google, and I am just a nobody, not someone whose tweet would get hundreds of retweets in an hour, not someone whose whisper would cause a hundred follow-up comments.

So, I would just suck it up and leave it rusty in the dungeon behind the red door.


PURGE THEM!

Darn useless results, and screw you, shameless net-garbage-generating trash website owners.

Frak! Screw them!

Where is the quality you keep blah-blah-blahing at us about?

After the Google+ project, today we are happy to announce Blogger-! See the sneak preview of the first feature below:


-Views: it means even more than 939,291,924

Every Blogger blogger will have an exciting statistic for your posts, a view count you could never have before: -1!

-Comments: a better way to manage your comments

No more spam when you enable this new feature. And the comment box will automatically disappear from your blog; isn't that sweet?

-Square: shape your thoughts, with people mattering less

A new place that allows you to post thoughts on a newly designed square-style page. An innovative idea from our developers: we make the page square, you shape your thought.

-Voids: creating an emptiness in discussion when you need it

This will help you a lot when your posts attract too much attention. By turning it on, you can ensure no more unnecessary attention. Best of all, it's on a per-post basis.

-Hangovers: beer/booze delivery in 15 mins, delivery guaranteed!

Feeling the frustration of blogging? Unsatisfied? Need to get wasted temporarily? You can even purchase the best hangover solutions for a small additional fee. Yes, we care about you!

-Immobility: limiting where you can blog and when!

Addicted to blogging? Is it interfering with your daily work?

Don't worry, your boss will not see you logging on to Blogger with this feature on. You will be completely blocked from your account depending on your location and the time.

-You: personal opinion filter

The new editor will provide a feature to filter out your personal opinions, just as the spelling checker does; it even suggests better wording. You will become a part of our collective!

-+-+-: mpmpm

We have no idea what this feature is. But hey, you can have it!

++++++++++[>+++++++>+++++++++++>+++++++++++>++++<<<<-]>----.>--.>+.<-----..--.>+++.>+++++.

Again we have no *beep* idea what this is.


I hate that red thing!

Yes, I have a Google+ account, so? Even if I did use Google+, I would not want that red thing on my navigation bar. I bet there are already 1,312 apps on every operating system, on every pad, on every phone, in every web browser as an addon, or in every app store (or even in your toilet while you are taking care of smelly business), which let you get notified about your... err... I have no idea what you can get notified about, because I don't use that cross, eh... I mean Plus.

I have been waiting patiently to see if Google will add an option to the Account settings, but still nothing. I guess you Google+ users love that red thing.

I would like to use Evanesco, but I am out of practice. So I used the best tool I have instead, Vimperator, to fix that thing.


I could have just hidden the whole navigation bar, but I do use that gear icon a few times, so I kept the bar to a minimum.

Side note: if you look closer, you can see Groups and Gmail are in fixed-width text. But that is another story, and here is another complaint about Blogger's editor's HTML mode: was it a designer who made the call on the HTML mode's font? Just as so many artists dislike Comic Sans, we programmers dislike variable-width text when dealing with code. It feels as wrong as opening Word to write C code.

Anyway, this is not really fixing anything; it just hides stuff. The script that retrieves the number of notifications still runs in the background. And that is another point I want to make: that number is updated at least a second later, or longer. How could you possibly want that snail to deliver your oh-so-important social interactive responding interconnection index? Get a desktop notification app. (The snail: who's summoning me? Oh boy, give me a second, I need to take a breath first.)

After fixing that thing, I found an interesting side-effect:


I kind of like this new Google homepage, so I didn't try to fix the fix.


I was trying to read an issue from a link, but the Google Code issue trackers seemed to be down.

I have seen this error page many, many times; it's quite common across Google's products. But I have never really read it.

There is an error, but I am not talking about Error 502.

Can you find it?

Sorry, no prize, just for fun.

Either Google employees are elite or it just is, well, a tiny error.

I was going to take a screenshot, crop out the important part, and I thought I might upload it to Picasa Web Albums. As I recall, it doesn't count against the quota if the size is smaller than 800 pixels. (I know Flickr allows bigger sizes for free uploads, but I always feel I shouldn't upload screenshots to Flickr.)

I wanted to make sure, so I googled and was amazed that the free quota for images is now up to 2048 by 2048 pixels. Hooray!

But wait! Read carefully! It's eligible only if you are a Google+ user; otherwise it's still 800x800.

So I just signed up and linked it to my Picasa Web Albums account, whatever it asked. A couple of clicks, I saw the first page of Google+, then I closed it. That's all my experience with Google+.

When Google+ was first released, people immediately started to compare it with Facebook. Face, what? I never really understood Facebook, and that gives me the same feeling about Google+.

Anyway, if you want free 2048x2048 uploads, go sign up for Google+. Now I am sure I will upload all my screenshots to Picasa Web Albums. But normal photos will still go to my free Flickr account, even though they are only shown at up to 1024x1024.

I am sure more and more free Flickr account holders who upload screenshots will start using Picasa Web Albums for them. Generally, a good screenshot has to be shown pixel-by-pixel, not resized, and 2048 pixels is more than enough for most people.