G+ archive
[1]      «      169   |   170   |   171   |   172   |   173   |   174   |   175   |   176   |   177      »      [181]

I hope they do it sooner rather than later. There was that genuine lea...
- 2012-08-17T20:12:12+0000 - Updated: 2012-08-17T20:12:12+0000
I hope they do it sooner rather than later. There was that genuine leak on Geekbench around the time of the Retina MacBook release. I certainly won't buy one without USB 3, or one with a fully glossy display.

iMac 2012 leaving factories in August

Shared with: Public

I recommend caution; I went against common wisdom of "don't fix it i...
- 2012-08-16T15:34:45+0000 - Updated: 2012-08-16T15:34:45+0000
I recommend caution; I went against the common wisdom of "don't fix it if it ain't broken" and upgraded from Snow Leopard. Bad mistake. I had to download a new version of Xcode and recompile all my MacPorts, ACDSee stopped working, and most of the interface is bloated with animations I had to turn off. It also no longer wakes on a swipe across the touchpad, and the show-desktop gesture is all wrong. I'm learning to live with it now, but if only I had known...
Shared with: Public
- 2012-08-17T08:48:17+0000 - Updated: 2012-08-17T08:48:37+0000
Yes, side effects that suck. Either go through the process or Time Machine warp back. Good luck.

I can't think of anything I'd use this for but I find it to be so...
- 2012-08-05T06:47:43+0000 - Updated: 2012-08-05T06:47:43+0000
I can't think of anything I'd use this for, but I find it so cool. The reason I'm not so sure about it is that any smartphone is already such a computer, apart from the monitor connection.
Shared with: Public
- 2012-08-05T08:00:52+0000 - Updated: 2012-08-05T08:03:09+0000
I see potential for a wearable computer every time I see one of those. Maybe even a controller for the kinds of machinery where Android could provide a more elegant environment to develop and run control software on. In the past I even tended to think of it as a low-power 24/7 home server.

But your second sentence seems spot-on; in a way the question is whether there will be enough space in the marketplace for "headless" smartphones. I believe there will be.
- 2012-08-05T10:16:59+0000
I don't see why the next generation of smartphones wouldn't have either Thunderbolt or some kind of wireless monitor connection that turns the phone into a desktop computer on demand, or at least connects it to a projector. Thunderbolt is in fact great because it encapsulates the entire PCI Express bus, so you could have the smartphone as a module that plugs into a slot on a laptop or, say, a cradle that connects to all the other peripherals, similar to the way the Thunderbolt Display serves as a hub.

This thing, a computer without peripherals, is more or less redundant because cellphones already have similar power, plus an integrated battery and touchscreen.

I think this is in fact the way to go:

It just needs a standard Thunderbolt docking connector.
- 2012-08-18T14:55:22+0000
To extrapolate slightly, that kind of thinking points to a paradigm in which all computer form factors below desktop size could be just "shells": tablet shells, handheld shells, laptop shells, wearable shells, surface shells, etc., into which you just plug the "computing module" such as the mini PC we are discussing.
But who knows how the future of the industry will pan out. As an idea it is already catching on, though (the Motorola Atrix immediately came to mind).

It seems the majority of people today actually see these kinds of devices as low-cost PCs for low-power "light computing".
- 2012-08-18T18:28:11+0000
It might go that way for some devices, but I would expect a "computer module" to become such a generic and cheap part of the whole device as to be expendable, i.e. it would be cheaper to just build it into the device than to engineer fail-proof connectors. What I'd like to see is what Gentry Lee wrote about in his SF books: modular, scalable computing, which is possible only with a super-fast bus, and we now have one in Thunderbolt. I mean connecting several machines to the bus and getting an instantly available cluster, working at the app-server level: an app server runs on each machine, and the servers seamlessly connect and use the CPU and RAM of all the connected physical machines to run threads.
- 2012-08-18T19:14:40+0000 - Updated: 2012-08-18T19:15:40+0000
http://xi3.com was in the news a few years ago; they seem to be at the forefront of these kinds of notions. They've even optimized the whole approach by inventing a new computer form factor, splitting the motherboard into three parts.
- 2012-08-18T21:35:20+0000
I'm more interested in the software part of the problem; the hardware part is reasonably straightforward and consists of CPUs connected to a fast bus. Writing software for load distribution is more difficult, especially if you don't go the BOINC route of distributing huge workloads over narrow bandwidth. A more useful route is something along the lines of GPU computing (like OpenCL), where you atomize your code and distribute small workloads, functions and their data, expecting a quick response. Instead of just using your GPU or GPU cores, you could use all the CPUs connected to the Thunderbolt bus. All the caveats of GPU computing apply: there is a certain amount of overhead from copying data back and forth over the bus, so you don't do this unless it really pays off, or you could actually make your machine slower.
What this could achieve in practice: you plug your iPhone and iPad into their docks, plug the laptop in for syncing, and your desktop's Geekbench score increases by the sum of the attached devices' scores, minus the cost of transport overhead.
That doesn't do much if a device's score is 600 (iPhone), but if it's 11000 (Retina MacBook) then it could kick some serious ass.
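The "atomize your code and distribute small workloads" idea can be sketched locally. This is only a stand-in, not a real bus-attached cluster: a thread pool plays the role of the CPUs on the fast bus, and the workload (hashing small chunks, think video frames) is a hypothetical example.

```python
# A local sketch of the "atomize and distribute" idea from the thread.
# ThreadPoolExecutor stands in for CPUs attached to a fast shared bus;
# the hashing workload is a made-up example of a small, shippable task.
from concurrent.futures import ThreadPoolExecutor
import hashlib

def crunch(chunk: bytes) -> str:
    # one small, self-contained unit of work: a function plus its data
    return hashlib.sha256(chunk).hexdigest()

def distribute(chunks, workers=4):
    # each chunk is shipped out and its result copied back, so the
    # per-task transfer overhead must stay small relative to the work
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(crunch, chunks))

if __name__ == "__main__":
    results = distribute([b"frame-%d" % i for i in range(8)])
    print(len(results))  # prints 8
```

The GPU-computing caveat shows up here too: if `crunch` is too cheap, the dispatch overhead dominates and the pooled version is slower than a plain loop.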
- 2012-08-18T21:47:50+0000
Oh yes, one possible benefit comes to mind: if all the computers in an office building are plugged into such a bus, a workstation that needs more CPU at some point could use those resources on demand, revving up all the computers in the building to crunch the numbers faster, instead of 99% of the computers idling while 1% need a motherboard/CPU upgrade to get more power. It would be similar to BOINC but more useful to normal users, not just major research institutions with huge projects that take years. This way you could juice all the computers in a corporate network on demand: when someone renders a video, every semi-idle machine in the network processes their frames.
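The "sum of scores minus transport overhead" arithmetic from a few comments up fits in one line. The overhead figure here is a made-up placeholder, since the real cost depends on the workload and the bus:

```python
# Back-of-envelope for the pooled-score idea; `overhead` is a hypothetical
# absolute cost of shipping data over the bus, not a measured number.
def pooled_score(scores, overhead=0):
    return sum(scores) - overhead

print(pooled_score([11000, 600], overhead=160))  # prints 11440
```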

This is very cool, all school kids should have one of those.
- 2012-08-03T12:30:57+0000 - Updated: 2012-08-03T12:30:57+0000
This is very cool, all school kids should have one of those.

Lenovo unveils toughened ThinkPad X131e for education, hikes price to $499

Shared with: Public

The moral of the story seems to be: "don't ever drop any of them on ...
- 2012-08-02T15:05:46+0000 - Updated: 2012-08-02T15:05:46+0000
The moral of the story seems to be: "don't ever drop any of them on the concrete; they're mostly screen, hence breakable".
Shared with: Public

Google+ post
- 2012-07-31T19:49:46+0000 - Updated: 2012-07-31T19:49:46+0000
Shared with: Public

Google+ post
- 2012-07-31T19:49:12+0000 - Updated: 2012-07-31T19:49:12+0000
Shared with: Public

I sure don't know why mac's crontab doesn't ask me which editor I...
- 2012-06-29T08:52:38+0000 - Updated: 2012-06-29T08:52:38+0000
I sure don't know why the Mac's crontab doesn't ask me which editor I prefer, the way Linux's does. I also don't know why vi, of all things, is the default, but this fixed it after I tried setenv to no avail.
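A minimal sketch of the fix, assuming bash or zsh on macOS: crontab honors the `EDITOR` (or `VISUAL`) environment variable, and `setenv` is csh/tcsh syntax, which would explain why it had no effect in the default shell.

```shell
# crontab -e opens whatever $EDITOR points to; put this export in
# ~/.bash_profile or ~/.zshrc so it persists across sessions.
# nano is just an example; any editor on $PATH works.
export EDITOR=nano
echo "$EDITOR"   # prints nano; now `crontab -e` opens in nano, not vi
```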

Change & Set the Default crontab Editor

Shared with: Public

So much for early warning.
Shared with: Public

Google+ post
Shared with: Public
