• The Evolution of Direct3D

    May 30, 2020 /  Computer Technology, Programming

    * UPDATE: Be sure to read the comment thread at the end of this blog, the discussion got interesting.

    It’s been many years since I worked on Direct3D, and over the years the technology has evolved dramatically. Modern GPU hardware has changed tremendously, achieving processing power and capabilities way beyond anything I dreamed of having access to in my lifetime. The evolution of the modern GPU is the result of many fascinating market forces, but the one I know best and find most interesting is the influence that Direct3D had on the new generation of GPU’s that sport thousands of processing cores, have billions more transistors than the host CPU and are many times faster at most applications. I’ve told a lot of funny stories about how political the creation of Direct3D was, but here I would like to document some of the history of how the Direct3D architecture came about and the profound influence that architecture had on modern consumer GPU’s.

    Published here with this article is the original documentation for Direct3D from when it was first introduced with DirectX 2 in 1995. Contained in this document is an architecture vision for 3D hardware acceleration that was largely responsible for shaping the modern GPU into the incredibly powerful, increasingly ubiquitous consumer general purpose supercomputers we see today.

    D3DOVER
    The reason I got into computer graphics was NOT an interest in gaming, it was an interest in computational simulation of physics. I studied 3D at Siggraph conferences in the late 1980’s because I wanted to understand how to approach simulating quantum mechanics, chemistry and biological systems computationally. Simulating light interactions with materials was all the rage at Siggraph back then, so I learned 3D. Understanding 3D mathematics and the physics of light made me a graphics and color expert, which got me a career in the publishing industry early on creating PostScript RIP’s (Raster Image Processors). I worked with a team of engineers in Cambridge England creating software solutions for printing screened color graphics before the invention of continuous tone printing. That expertise got me recruited by Microsoft in the early 1990’s to re-design the Windows 95 and Windows NT print architecture to be more competitive with Apple’s superior capabilities at that time. My career came full circle back to 3D when an initiative I started with a few friends to re-design the Windows graphics and media architecture (DirectX) to support real-time gaming and video applications resulted in gaming becoming hugely strategic to Microsoft. Sony had introduced a consumer 3D game console (the Playstation 1), and being responsible for DirectX it was incumbent on us to find a 3D solution for Windows as well.

    For me, the challenge in formulating a strategy for consumer 3D gaming for Microsoft was an economic one. What approach should Microsoft take to consumer 3D to create a vibrant, competitive market for consumer 3D hardware that was both affordable to consumers AND future proof? The complexity of realistically simulating 3D graphics in real time was so far beyond our capabilities in that era that there was NO hope of choosing a solution that was anything short of an ugly hack: something that would produce “good enough” 3D for games while being very far removed from the mathematically ideal solutions we had little hope of seeing implemented in the real world during our careers.

    Up until that point the only commercial solutions for 3D hardware were for CAD (Computer Aided Design) applications. These solutions worked fine for people who could afford hundred-thousand-dollar workstations. Although the OpenGL API was the only “standard” for 3D API’s that the market had, it had not been designed with video game applications in mind. For example, texture mapping, an essential technique for producing realistic graphics, was not a priority for CAD models, which needed to be functional, not look cool. Rich dynamic lighting was also important to games but not as important to CAD applications. High precision was far more important to CAD applications than gaming. Most importantly, OpenGL was not designed for highly interactive real-time graphics that used off-screen video page buffering to avoid tearing artifacts during rendering. It was not that the OpenGL API could not be adapted to handle these features for gaming, simply that its actual market implementation on expensive workstations did not suggest any elegant path to a $200 consumer gaming card.

    In the early 1990’s computer RAM was very expensive, and as such, early 3D consumer hardware designs optimized for minimal RAM requirements. The Sony Playstation 1 optimized for this problem by using a 3D hardware solution that did not rely on a memory-intensive data structure called a Z-buffer; instead it used a polygon-level sorting algorithm that produced ugly intersections between moving joints. This “Painter’s Algorithm” approach to 3D was very fast and required little RAM. It was an ugly but pragmatic approach for gaming that would have been utterly unacceptable for CAD applications.
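
    To make the trade-off concrete, here is a minimal C++ sketch (my own illustration, not the PlayStation or Direct3D code) contrasting the two visibility approaches: a painter’s-algorithm pass that sorts whole polygons back to front, and a Z-buffer pass that resolves visibility per pixel at the cost of a full-screen depth array. The structures and names are invented for the example.

        #include <algorithm>
        #include <cstdint>
        #include <vector>

        struct Fragment { int x, y; float z; uint32_t color; };   // one candidate pixel
        struct Polygon  { float averageZ; std::vector<Fragment> fragments; };

        // Painter's algorithm: sort polygons back-to-front and overwrite pixels blindly.
        // Cheap on memory, but two interpenetrating polygons cannot be ordered correctly.
        void paintersAlgorithm(std::vector<Polygon>& polys,
                               std::vector<uint32_t>& framebuffer, int width) {
            std::sort(polys.begin(), polys.end(),
                      [](const Polygon& a, const Polygon& b) { return a.averageZ > b.averageZ; });
            for (const Polygon& p : polys)
                for (const Fragment& f : p.fragments)
                    framebuffer[f.y * width + f.x] = f.color;
        }

        // Z-buffer: keep a per-pixel depth value and test every fragment against it.
        // Costs a full-screen float buffer (expensive in early-90s RAM) but resolves
        // intersections correctly regardless of drawing order.
        void zBuffer(const std::vector<Polygon>& polys, std::vector<uint32_t>& framebuffer,
                     std::vector<float>& depth, int width) {
            std::fill(depth.begin(), depth.end(), 1.0f);           // initialize to the far plane
            for (const Polygon& p : polys)
                for (const Fragment& f : p.fragments)
                    if (f.z < depth[f.y * width + f.x]) {          // closer than what is stored?
                        depth[f.y * width + f.x] = f.z;
                        framebuffer[f.y * width + f.x] = f.color;
                    }
        }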

    In formulating the architecture for Direct3D we were faced with innumerable similarly difficult choices. We wanted the leading Windows graphics vendors of the time (ATI, Cirrus, Trident, S3, Matrox and many others) to be able to compete with one another for rapid innovation in the 3D hardware market without creating utter chaos. The technical solution that Microsoft’s OpenGL team espoused via Michael Abrash was a driver model called 3DDDI (3D Device Driver Interface). 3DDDI was a very simple, flat driver model that just supported hardware acceleration of 3D rasterization. The complex mathematics associated with transforming and lighting a 3D scene were left to the CPU. 3DDDI used “capability bits” to specify additional hardware rendering features (like filtering) that consumer graphics card makers could optionally implement. The problem with 3DDDI was that it invited problems for game developers out of the gate. There were so many cap-bits that every game would either have to support an innumerable number of combinations of unspecified hardware features, to take advantage of every possible way that hardware vendors might choose to design their chips, producing an untestable number of possible hardware configurations and a huge amount of redundant art assets that games would have to lug around to look good on any given device, OR games would revert to using a simple set of common 3D features supported by everyone, and there would be NO competitive advantage for companies to support new hardware 3D capabilities that did not have instant market penetration. The OpenGL crowd at Microsoft did not see this as a big problem in their world because everyone just bought a $100,000 workstation that supported everything they needed.
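
    To illustrate the combinatorics problem, here is a small hypothetical sketch. The flag names are made up and the real 3DDDI/Direct3D capability sets were far larger, which is exactly the point: every optional feature a game queries forks its code paths and its art, and N optional features imply up to 2^N hardware configurations to test.

        #include <cstdint>
        #include <cstdio>

        // Hypothetical capability bits; the real driver flags were far more numerous.
        enum Caps : uint32_t {
            CAP_BILINEAR_FILTER = 1u << 0,
            CAP_FOG_TABLE       = 1u << 1,
            CAP_ALPHA_BLEND     = 1u << 2,
            CAP_MIPMAPPING      = 1u << 3,
            CAP_SPECULAR        = 1u << 4,
        };

        // A game either branches on every bit it cares about...
        void configureRenderer(uint32_t caps) {
            bool filter = caps & CAP_BILINEAR_FILTER;   // each check forks the code path
            bool fog    = caps & CAP_FOG_TABLE;         // and potentially the art assets
            bool blend  = caps & CAP_ALPHA_BLEND;
            std::printf("filter=%d fog=%d blend=%d\n", filter, fog, blend);
        }

        int main() {
            const int optionalFeatures = 20;            // a modest number for the mid-90s
            // ...or it must somehow test 2^N distinct hardware configurations.
            std::printf("configurations to test: %.0f\n", double(1ull << optionalFeatures));
            configureRenderer(CAP_BILINEAR_FILTER | CAP_ALPHA_BLEND);
            return 0;
        }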

    The realization that we could not get what we needed from the OpenGL team was one of the primary reasons we decided to create a NEW 3D API just for gaming. It had nothing to do with the API itself, but with the driver architecture underneath, because we needed to create a competitive market that did not result in chaos. In this respect the Direct3D API was not an alternative to the OpenGL API; it was a driver API designed for the sole economic purpose of creating a competitive market for 3D consumer hardware. In other words, the Direct3D API was not shaped by “technical” requirements so much as economic ones. In this respect the Direct3D API was revolutionary in several interesting ways that had nothing to do with the API itself but rather the driver architecture it would rely on.

    When we decided to acquire a 3D team to build Direct3D, I was chartered with surveying the market for candidate companies with the right expertise to help us build the API we needed. As I have previously recounted, we looked at Epic Games (creators of the Unreal engine), Criterion (later acquired by EA), Argonaut and finally Rendermorphics. We chose Rendermorphics (based in London) because of the large number of quality 3D engineers the company employed and because the founder, Servan Kiondijian, had a very clear vision of how consumer 3D drivers should be designed for maximum future compatibility and innovation. The first implementation of the Direct3D API was rudimentary but quickly evolved towards something with much greater future potential.

    D3DOVER left-handed
    Whoops!

    My principal memory from that period was a meeting in which I, as the resident expert on the DirectX 3D team, was asked to choose a handedness for the Direct3D API. I chose a left-handed coordinate system, in part out of personal preference. I remember it now only because it was an arbitrary choice that caused no end of grief for years afterwards, as all other graphics authoring tools adopted the right-handed coordinate system of the OpenGL standard. At the time nobody knew or believed that a CAD tool like Autodesk would evolve to become the standard tool for authoring game graphics. Microsoft had acquired Softimage with the intention of displacing Autodesk and Maya anyway. Whoops …
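
    For readers unfamiliar with why handedness causes grief: moving assets between a right-handed authoring tool and a left-handed API means negating one axis and reversing triangle winding everywhere, and a single missed conversion produces mirrored or inside-out geometry. A minimal sketch of the usual fix (my own illustration, not code from the period):

        #include <array>
        #include <utility>
        #include <vector>

        struct Vec3 { float x, y, z; };
        using Triangle = std::array<int, 3>;   // indices into a vertex array

        // Convert a right-handed mesh (the convention of OpenGL and most authoring
        // tools) to a left-handed one (the early Direct3D convention): negate Z and
        // flip the winding order of every triangle so back-face culling still rejects
        // the correct side.
        void rightHandedToLeftHanded(std::vector<Vec3>& vertices, std::vector<Triangle>& tris) {
            for (Vec3& v : vertices)
                v.z = -v.z;                    // mirror across the XY plane
            for (Triangle& t : tris)
                std::swap(t[1], t[2]);         // clockwise <-> counter-clockwise
        }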

    The early Direct3D HAL (Hardware Abstraction Layer) was designed in an interesting way. It was structured vertically into three stages.

    DX 2 HAL

    The highest, most abstract layer was the transformation layer, the middle layer was dedicated to lighting calculations, and the bottom layer was for rasterization of the finally transformed and lit polygons into depth-sorted pixels. The idea behind this vertical driver structure was to provide a relatively rigid feature path for hardware vendors to innovate along. They could differentiate their products from one another by designing hardware that accelerated increasingly higher layers of the 3D pipeline, resulting in greater performance and realism without incompatibilities, a sprawling matrix of configurations for games to test against, or redundant art assets. Since the Direct3D API created by Rendermorphics provided a “pretty fast” software implementation of any functionality not accelerated by the hardware, game developers could focus on the Direct3D API without worrying about myriad permutations of incompatible 3D hardware capabilities. At least that was the theory. Unfortunately, like the 3DDDI driver specification, Direct3D still included capability bits designed to enable hardware features that were not part of the vertical acceleration path. Although I actively objected to Direct3D’s tendency to accumulate capability bits, the team felt extraordinary competitive pressure from Microsoft’s own OpenGL group and from the hardware vendors to support them.
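
    The layering can be sketched as three stacked interfaces, each with a software fallback in the runtime that a driver may override from the bottom up. This is a simplified illustration of the idea only; the interface and type names below are invented and do not reflect the actual DirectX 2 HAL definitions.

        #include <vector>

        struct Vertex    { float x, y, z; };             // object-space position
        struct LitVertex { float x, y, z, r, g, b; };    // transformed and lit

        // The three vertical stages of the early Direct3D HAL (illustrative names).
        // A driver advertises the highest stage it accelerates; anything above that
        // runs in the Direct3D software runtime instead.
        struct ITransform  { virtual std::vector<Vertex>    transform(const std::vector<Vertex>& in) = 0;  virtual ~ITransform()  = default; };
        struct ILighting   { virtual std::vector<LitVertex> light(const std::vector<Vertex>& in)     = 0;  virtual ~ILighting()   = default; };
        struct IRasterizer { virtual void                   rasterize(const std::vector<LitVertex>& in) = 0; virtual ~IRasterizer() = default; };

        // Software fallbacks provided by the runtime ("pretty fast", always available).
        struct SoftwareTransform : ITransform {
            std::vector<Vertex> transform(const std::vector<Vertex>& in) override { return in; } // identity placeholder
        };
        struct SoftwareLighting : ILighting {
            std::vector<LitVertex> light(const std::vector<Vertex>& in) override {
                std::vector<LitVertex> out;
                for (const Vertex& v : in) out.push_back({v.x, v.y, v.z, 1.0f, 1.0f, 1.0f}); // flat white "lighting"
                return out;
            }
        };

        // The game always drives one fixed pipeline; which stages are hardware is a driver detail.
        void drawScene(ITransform& xf, ILighting& lit, IRasterizer& ras, const std::vector<Vertex>& mesh) {
            ras.rasterize(lit.light(xf.transform(mesh)));
        }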

    The hardware companies, seeking a competitive advantage for their own products, would threaten to support and promote OpenGL to game developers because the OpenGL driver model supported capability bits that enabled them to create features for their hardware that nobody else supported. It was common (and still is) for the hardware OEM’s to pay game developers to adopt features of their hardware unique to their products but incompatible with the installed base of gaming hardware, forcing consumers to constantly upgrade their graphics card to play the latest PC games. Game developers alternately hated capability bits because of their complexity and incompatibilities but wanted to take the marketing dollars from the hardware OEM’s to support “non-standard” 3D features.

    Overall I viewed this dynamic as destructive to a healthy PC gaming economy and advocated resisting the trend regardless of what the OpenGL people or the OEM’s wanted. I believed that creating a consistent, stable consumer market for PC games was more important than appeasing the hardware OEM’s. As such I was a strong advocate of the relatively rigid vertical Direct3D pipeline and a proponent of introducing only API features that we expected to become universal over time. I freely confess that this view implied significant constraints on innovation in other areas and placed a high burden of market prescience on the Direct3D team.

    The result, in my estimation, was pretty good. The Direct3D fixed function pipeline, as it was known, produced a very rich and growing PC gaming market with many healthy competitors through to DirectX 7.0 and the early 2000’s. The PC gaming market boomed and grew to be the largest gaming market on Earth. It also resulted in a very interesting change in the GPU hardware architecture over time.

    Had the Direct3D HAL been a flat driver model with just capability bits for rasterization, as the OpenGL team at Microsoft had advocated, 3D hardware makers would have competed by accelerating just the bottom layer of the 3D rendering pipeline and adding differentiating features to their hardware via capability bits that were incompatible with their competitors. The result of introducing the vertical layered architecture was that 3D hardware vendors were all encouraged to add features to their GPU’s that were more consistent with general purpose CPU architectures, namely very fast floating point operations, in a consistent way. Thus consumer GPU’s evolved over the years to increasingly resemble general purpose CPU’s … with one major difference. Because the 3D fixed function pipeline was rigid, the Direct3D architecture afforded very little opportunity for the kind of frequent code branching that CPU’s are designed to optimize for. GPU’s achieved their amazing performance and parallelism in part by being free to assume that little or no branching code would ever occur inside a Direct3D graphics pipeline. Thus instead of evolving one giant monolithic CPU core with massive numbers of transistors dedicated to efficient branch prediction, as an Intel CPU has, a GPU built for Direct3D has hundreds to thousands of simple cores that have no branch prediction. They can chew through a calculation at incredible speed, confident in the knowledge that they will not be interrupted by code branching or random memory accesses to slow them down.
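
    The practical consequence for anyone writing code aimed at that pipeline was that conditionals were routinely rewritten as arithmetic or min/max operations so that every element in a batch follows the same instruction path. A tiny illustration of the pattern (written in C++ for readability; the same idiom shows up in shader code):

        #include <algorithm>
        #include <cstdio>
        #include <vector>

        // Branchy version: fine on a CPU with branch prediction, hostile to a wide
        // parallel pipeline where divergent paths serialize execution.
        float saturateBranchy(float x) {
            if (x < 0.0f) return 0.0f;
            if (x > 1.0f) return 1.0f;
            return x;
        }

        // Branch-free version: min/max map to single instructions, so every element of
        // a batch follows the identical instruction path -- the assumption the fixed
        // function pipeline let GPU designers bake into their hardware.
        float saturateBranchless(float x) {
            return std::min(1.0f, std::max(0.0f, x));
        }

        int main() {
            for (float x : std::vector<float>{-0.5f, 0.25f, 1.75f})
                std::printf("%5.2f -> %4.2f (branchy) %4.2f (branchless)\n",
                            x, saturateBranchy(x), saturateBranchless(x));
            return 0;
        }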

    Up through DirectX 7.0 the underlying parallelism of the GPU was hidden from the game. As far as the game was concerned some hardware was just faster than other hardware, but the game should not have to worry about how or why. The early DirectX fixed function pipeline architecture had done a brilliant job of enabling dozens of disparate competing hardware vendors to all take different approaches to achieving superior cost and performance in consumer 3D without making a total mess of the PC gaming market for the game developers and consumers. It was not pretty and was not entirely executed with flawless precision, but it worked well enough to create an extremely vibrant PC gaming market through to the early 2000’s.

    Before I move on to discussing the more modern evolution of Direct3D, I would like to highlight a few other important ideas that influenced the architecture of modern GPU’s in the early days of Direct3D. Recall that in the early to mid 1990’s RAM was relatively expensive, so there was a lot of emphasis on consumer 3D techniques that conserved RAM usage. The Talisman architecture, which I have told many (well-deserved) derogatory stories about, was highly influenced by this observation.

    Talisman
    Search this blog for tags “Talisman” and “OpenGL” for many stories about the internal political battles over these technologies within Microsoft

    Talisman relied on a grab bag of graphics “tricks” to minimize GPU RAM usage that were not very generalized. The Direct3D team, heavily influenced by the Rendermorphics founders, had made a difficult choice in philosophical approach to creating a mass market for consumer 3D graphics. We had decided to go with a simpler, more general purpose approach to 3D that relied on a very memory-intensive data structure called a Z-buffer to achieve great looking results. Rendermorphics had managed to achieve very good 3D performance in pure software with a software Z-buffer in the Rendermorphics engine, which had given us the confidence to take the bet to go with a simpler, more general purpose 3D API and driver model and trust that the hardware RAM market and prices would eventually catch up. Note however that at the time we were designing Direct3D we did not know about the Microsoft Research group’s “secret” Talisman project, nor did they expect that a small group of evangelists would cook up a new 3D API standard for gaming and launch it before their own wacky initiative could be deployed. In short, one of the big bets that Direct3D made was that the simplicity and elegance of Z-buffers for game development were worth the risk that consumer 3D hardware would struggle to affordably support them early on.

    Despite the big bet on Z-buffer support we were intimately aware of two major limitations of the consumer PC architecture that needed to be addressed. The first was that the PC bus was generally very slow, and the second was that it was much slower to copy data back from a graphics card than it was to copy data to a graphics card. What that generally meant was that our API design had to be optimized to send data in the largest, most compact packages possible to the GPU for processing, and to absolutely minimize any need to copy data back from the GPU for further processing on the CPU. This generally meant that the Direct3D API was optimized to package data up and send it on a one-way trip. This was of course an unfortunate constraint, because there were many brilliant 3D effects that could best be accomplished by mixing the CPU’s efficient branch prediction and robust floating point support with the GPU’s incredible parallel rendering performance.

    One of the fascinating consequences of that constraint was that it forced GPU’s to become even more general purpose to compensate for the inability to share data with the CPU efficiently. This was possibly the opposite of what Intel intended to happen with its limited bus performance, because Intel was threatened by the idea that auxiliary cards would offload more processing from their CPU’s, thereby reducing the Intel CPU’s value and central role in PC computing. It was reasonably believed at that time that Intel deliberately dragged their feet on improving PC bus performance to deter a market for alternatives to their CPU’s for consumer media processing applications. Recall from my earlier blogs that the main reason for creating DirectX was to prevent Intel from trying to virtualize all the Windows media support on the CPU. Had Intel adopted a PC bus architecture that enabled extremely fast access to system RAM shared by auxiliary devices, it is less likely that GPU’s would have evolved the relatively rich set of branching and floating point operations they support today.

    To overcome the fairly stringent performance limitations of the PC bus, a great deal of thought was put into techniques for compressing and streamlining DirectX assets being sent to the GPU, in order to minimize bus bandwidth limitations and the need for round trips from the GPU back to the CPU. The early need for the rigid 3D pipeline had interesting consequences later on when we began to explore streaming 3D assets over the Internet via modems.

    We recognized early on that support for compressed texture maps would dramatically improve bus performance and reduce the amount of onboard RAM consumer GPU’s needed. The problem was that no standards existed for 3D texture formats at the time, and knowing how fast image compression technologies were evolving, I was loath to impose a Microsoft-specified one “prematurely” on the industry. To overcome this problem we came up with the idea of “blind compression formats”. The idea, which I believe was captured in one of the many DirectX patents that we filed, was that a GPU could encode and decode image textures in an unspecified format, but the DirectX API’s would allow the application to read and write from them as though they were always raw bitmaps. The Direct3D driver would encode and decode the image data as necessary under the hood without the application needing to know how it was actually being encoded on the hardware.
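
    In rough code terms, the idea is an opaque texture object whose storage format belongs entirely to the driver, with accessors that always present raw pixels to the application. The sketch below is my paraphrase of the concept with invented names and a trivial stand-in codec; it is not the actual DirectX interface or any real driver format.

        #include <cstdint>
        #include <vector>

        using RawPixels = std::vector<uint32_t>;       // what the application always sees: RGBA8

        // Opaque, driver-chosen storage. The application never learns this layout.
        class BlindCompressedTexture {
        public:
            void write(const RawPixels& pixels) {
                storage_ = encode(pixels);             // driver encodes on the way in
            }
            RawPixels read() const {
                return decode(storage_);               // driver decodes on the way out
            }
        private:
            // Stand-in codec: a real driver would use its own proprietary block compression.
            static std::vector<uint8_t> encode(const RawPixels& px) {
                std::vector<uint8_t> out;
                for (uint32_t p : px)                  // here: just keep 8 bits per pixel
                    out.push_back(static_cast<uint8_t>(p & 0xFF));
                return out;
            }
            static RawPixels decode(const std::vector<uint8_t>& data) {
                RawPixels out;
                for (uint8_t v : data)                 // expand back to "raw" grey pixels
                    out.push_back(0xFF000000u | (v << 16) | (v << 8) | v);
                return out;
            }
            std::vector<uint8_t> storage_;             // unspecified, hardware-friendly format
        };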

    By DirectX 6.0 in 1998, 3D chip makers had begun to devise good quality 3D texture compression formats, such that we were able to license one of them (from S3) for inclusion with Direct3D.

    http://www.microsoft.com/en-us/news/press/1998/mar98/s3pr.aspx

    DirectX 6.0 was actually the first version of DirectX that was included in a consumer OS release (Windows 98). Until that time, DirectX was actually just a family of libraries that shipped with the Windows games that used them. DirectX did not actually become a Windows API until five generations after its first release.

    DirectX 7.0 was the last generation of DirectX that relied on the fixed function pipeline we had laid out in DirectX 2.0 with the first introduction of the Direct3D API. This was a very interesting transition period for Direct3D for several reasons:

    1) The original DirectX team founders had all moved on,

    2) Microsoft’s internal Talisman project and its reasons for supporting OpenGL had both passed,

    3) Microsoft had brought game industry veterans like Seamus Blackley, Kevin Bacchus, Stuart Moulder and others into the company in senior roles, and

    4) Gaming had become a strategic focus for the company.

    DirectX 8.0 marked a fascinating transition for Direct3D because, with the death of Talisman and the loss of strategic interest in OpenGL 3D support, many of the people from these groups came to work on Direct3D. Talisman, OpenGL and game industry veterans all came together to work on Direct3D 8.0. The result was very interesting. Looking back I freely concede that I would not have made the same set of choices that this group made for DirectX 8.0, but it seems to me that everything worked out for the best anyway.

    Direct3D 8.0 was influenced in several interesting ways by the market forces of the late 20th century. Microsoft largely unified against OpenGL and found itself competing with the Khronos Group standards committee to advance Direct3D faster than OpenGL. With the death of SGI, control of the OpenGL standard fell into the hands of the 3D hardware OEM’s, who of course wanted to use the standard to enable them to create hardware features that differentiated them from their competitors and to force Microsoft to support 3D features they wanted to promote. The result was that Direct3D and OpenGL both became much more complex, and they tended to converge during this period. There was a stagnation in 3D feature adoption by game developers from DirectX 8.0 through DirectX 11.0 as a result of these changes. Creating game engines became so complex that the market also converged around a few leading engine providers, including Epic’s Unreal Engine and the Quake engine from id Software.

    Had I been working on Direct3D at the time I would have stridently resisted letting the 3D chip OEM’s lead Microsoft around by the nose chasing OpenGL features instead of focusing on enabling game developers and a consistent, quality consumer experience. I would have opposed introducing shader support in favor of trying to keep the Direct3D driver layer as vertically integrated as possible to ensure feature conformity among hardware vendors. I also would have strongly opposed abandoning DirectDraw support as was done in Direct3D 8.0. The 3D guys got out of control and decided that nobody should need pure 2D API’s once developers adopted 3D, failing to recognize that simple 2D API’s enabled a tremendous range of features and ease of programming that the majority of developers who were not 3D geniuses could easily understand and use. Forcing the market to learn 3D dramatically constrained the set of people with the expertise to adopt it. Microsoft later discovered the error in this decision and re-introduced DirectDraw as the Direct2D API. Basically, letting the 3D design geniuses loose on Direct3D 8.0 made it brilliant, powerful and useless to average developers.

    At the time that DirectX 8.0 was being made I was starting my first company, WildTangent Inc., and ceased to be closely involved with what was going on with DirectX features; however, years later I was able to get back to my 3D roots and took the time to learn Direct3D programming in DirectX 11.1. Looking back, it’s interesting to see how the major architectural changes that were made in DirectX 8 resulted in the massively convoluted and nearly incomprehensible Direct3D API we see today. Remember the three-stage DirectX 2 pipeline that separated transformation, lighting and rasterization into three basic stages? Here is a diagram of the modern DirectX 11.1 3D pipeline.

    DX 11 Pipeline

    Yes, it grew to 9 stages, and to 13 stages when arguably some of the optional sub-stages, like the compute shader, are included. Speaking as somebody with an extremely lengthy background in very low-level 3D graphics programming, I’m embarrassed to confess that I struggled mightily to learn Direct3D 11.1 programming. The API had become very nearly incomprehensible and unlearnable. I have no idea how somebody without my extensive background in 3D and graphics could ever begin to learn how to program a modern 3D pipeline. As amazingly powerful and featureful as this pipeline is, it is also damn near unusable by any but a handful of the brightest minds in 3D graphics. In the course of catching up on my Direct3D I found myself simultaneously in awe of the astounding power of modern GPU’s and where they were going, and in shocked disgust at the absolute mess the 3D pipeline had become. It was as though the Direct3D API had become a dumping ground for every 3D feature that every OEM had demanded over the years.

    Had I not enjoyed the benefit of a decade-long break from Direct3D involvement, I would undoubtedly have a long history of bitter blogs written about what a mess my successors had made of a great and elegant vision for consumer 3D graphics. Weirdly, however, leaping forward in time to the present day, I am forced to admit that I’m not sure it was such a bad thing after all. The result of the mess Microsoft and the OEM’s made of the Direct3D API, and the stagnation of gaming on the PC that followed, was a successful XBOX. Having a massively fragmented 3D API is not such a problem if game developers have only one hardware configuration to support, as is the case with a game console. Direct3D 8.0, with its early primitive shader support, was the basis for the first XBOX’s graphics API. Microsoft selected NVIDIA for the first XBOX, giving NVIDIA a huge advantage in the PC 3D chip market. DirectX 9.0, with more advanced shader support, was the basis for the XBOX 360, for which Microsoft selected ATI to provide the 3D chip, this time handing AMD a huge advantage in the PC graphics market. In a sense the OEM’s had screwed themselves. By successfully influencing Microsoft and the OpenGL standards groups to adopt highly convoluted graphics pipelines to support all of their feature sets, they had forced themselves to generalize their GPU architectures, and the 3D chip market consolidated around whatever 3D chip architecture Microsoft selected for its consoles.

    The net result was that the retail PC game market largely died. It was simply too costly, too insecure and too unstable a platform for publishing high production value games on any longer, with the partial exception of MMOG’s. Microsoft and the OEM’s had conspired together to kill the proverbial golden goose. No biggie for Microsoft, as they were happy to gain complete control of the former PC gaming business by virtue of controlling the XBOX.

    From the standpoint of the early DirectX vision, I would have said that this outcome was a foolish, shortsighted disaster. Had Microsoft maintained a little discipline and strategic focus on the Direct3D API, they could have ensured that there were NO other consoles in existence within a single generation by using the XBOX to strengthen the PC gaming market rather than inadvertently destroying it. While Microsoft congratulates itself for the first successful U.S. launch of a game console, I would count all the gaming dollars collected by Sony, Nintendo and mobile gaming platforms over the years that might have remained on Microsoft-controlled platforms had Microsoft maintained a cohesive strategy across media platforms. I say all of this from a past tense perspective because, today, I’m not so sure that I’m really all that unhappy with the result.

    The new generation of consoles from Sony AND Microsoft have reverted to a PC architecture! The next generation GPU’s are massively parallel, general-purpose processors with intimate access to memory shared with the CPU. In fact, the GPU architecture became so generalized that a new pipeline stage called DirectCompute was added in DirectX 11 that simply allows the CPU to bypass the entire convoluted Direct3D graphics pipeline in favor of programming the GPU directly. With the introduction of DirectCompute the promise of simple 3D programming returned in an unexpected form. Modern GPU’s have become so powerful and flexible that the possibility of writing cross-GPU 3D engines directly for the GPU, without making any use of the traditional 3D pipeline, is an increasingly practical and appealing programming option. From my perspective here in the present day, I would anticipate that within a few short generations the need for the traditional Direct3D and OpenGL API’s will vanish in favor of new game engines with much richer and more diverse feature sets that are written entirely in device-independent shader languages like Nvidia’s CUDA and Microsoft’s AMP API’s.
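
    To show why the compute model feels so much simpler than the graphics pipeline, here is a CPU-side sketch of the dispatch model DirectCompute exposes: a small kernel run over thread groups against a flat buffer, with no vertex formats, rasterizer state or render targets involved. This merely emulates the model on the CPU for illustration; the group size and kernel are invented, and real code would compile an HLSL compute shader and invoke Dispatch on a Direct3D 11 device context.

        #include <cstdio>
        #include <vector>

        // A "kernel" in the compute model: each logical thread touches one element of a
        // buffer, identified only by its global thread index.
        void kernel(std::vector<float>& buffer, unsigned threadId) {
            if (threadId < buffer.size())
                buffer[threadId] = buffer[threadId] * buffer[threadId];   // e.g. square each value
        }

        // A stand-in for Dispatch(groupCount): run groupCount groups of threadsPerGroup
        // threads. On a GPU these run in parallel; here we just loop to show the model.
        void dispatch(std::vector<float>& buffer, unsigned groupCount, unsigned threadsPerGroup) {
            for (unsigned g = 0; g < groupCount; ++g)
                for (unsigned t = 0; t < threadsPerGroup; ++t)
                    kernel(buffer, g * threadsPerGroup + t);
        }

        int main() {
            std::vector<float> data = {1, 2, 3, 4, 5, 6, 7, 8};
            const unsigned threadsPerGroup = 4;                           // analogue of [numthreads(4,1,1)]
            dispatch(data, (data.size() + threadsPerGroup - 1) / threadsPerGroup, threadsPerGroup);
            for (float v : data) std::printf("%.0f ", v);
            std::printf("\n");
            return 0;
        }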

    Today, as a 3D physics engine developer, I have never been so excited about GPU programming, because of the sheer power and relative ease of programming directly to the modern GPU without needing to master the enormously convoluted 3D pipelines associated with the Direct3D and OpenGL API’s. If I were responsible for Direct3D strategy today I would be advocating dumping the investment in the traditional 3D pipeline in favor of rapidly opening direct access to a rich GPU programming environment. I personally never imagined that my early work on Direct3D would, within a couple of decades, contribute to the evolution of a new kind of ubiquitous processor that enables the kind of incredibly realistic and general modeling of light and physics that I learned about in the 1980’s but never believed I would see computers powerful enough to model in real time during my active career.


  • The Beginner’s Guide to

    May 30, 2020 /  Internet Services

    The Existing Marketing Issues to Be aware of in 2020

    The year 2020 has many companies and individuals in the marketing industry experiencing a wide variety of challenges. The crowding experienced in every economy and market is among the worst yet. Achieving business success does not come easy, especially with the very high level of competition caused by the low barrier to entry. This page covers some of the common factors challenging marketers today, along with reliable solutions to most of them, so that your organization’s marketing efforts will not fall prey to the challenges the entire marketing industry is facing in 2020.

    The first and foremost challenge that marketers in 2020 are experiencing is inadequate data analysis. Gone are the days when organizations had insufficient data; on the contrary, there is now a plethora of digital marketing software programs and modern data harvesting facilities. The issue many companies face is not a lack of information but not knowing how valuable all that data can be to the business. Leveraging supplementary technology such as machine learning can be a great solution in such a case. Machine learning comes in handy when large amounts of data need to be analyzed with the primary objective of finding regression patterns.

    Apart from that, marketing departments are focusing more on standing out in highly competitive markets. As aforementioned, the internet simplifies matters for entrepreneurs, as starting up a business is no longer a complicated process. As a result, markets become overcrowded with similar offerings, making it hard to get clients to shop from you. Humanizing your product brands becomes the ultimate solution under such circumstances. Since technology has the side effect of making online brands harder to humanize, you can best bring out the human aspect of your business through storytelling and by leveraging the personal social media accounts of your personnel.

    Knowing that economies in developing countries can produce similar goods cheaply, another common issue arises: price wars. Do not just use price to set yourself apart; try better marketing instead.

  • Logitech headsets and webcams for the business professional

    May 29, 2020 /  Computer Technology, Hardware

    As many of you know, I’m a full-time telecommuter. Although a portion of my work involves some travel, most days I am working from home, and a lot of that involves sitting on conference calls with colleagues and customers/partners.

    Until recently, much of that required that I be desk-bound.

    Anyone who has to work with VOIP and IP-based conferencing systems such as Skype, Microsoft Lync, Cisco WebEx and Citrix GoToMeeting knows that voice quality is everything if you’re going to have an effective business conversation.

    And that means using devices that typically tie you to your desk, such as a wired headset or a Bluetooth/USB speakerphone, such as the Plantronics Callisto, which I have and think is an excellent product.

    While there are many Bluetooth headsets and earpieces on the market which are perfectly suitable for mobile phone conversations, few are specifically optimized for use with PCs that have VOIP “Soft Phone” software or deliver what I would regard as business-critical voice quality.

    They are perfectly fine for short calls, but not ideal when you are on a VOIP conference for as much as an hour at a time, or even longer, particularly when you need to be an active participant and when paying close attention to who is speaking and the clarity of what you are saying is essential.

    As we all know about Bluetooth when it comes to audio streams, the farther you get away from the transceiver, the worse the audio gets. So it’s not practical to stray too far away from your PC.

    Logitech’s latest wireless headsets have been a total game changer for my personal work situation since I’ve been using them the last few months. I’ve been using the H820e stereo version which retails for $199 but can be found for considerably less.

    Installation and use of the headset is pretty straightforward — you plug the DECT 6.0 transmitter and charging base into a free USB port on your PC or Mac, and the AC power cord to power the base. The headset charges on the base when not in use, and has a built-in rechargeable battery.

    The operating system recognizes it automatically, and depending on the VOIP program you are using, you may need to alter the settings to use the headset as your primary audio device.

    If you’re familiar with the DECT 6.0 1.9Ghz wireless transmission standard, particularly if you have cordless phones in your house that use the technology, you know that you can get some pretty impressive range and not lose any voice quality. That’s exactly what the H820e headset gives you for VOIP calls.

    My home office is a good 60 feet away from my living room and around 75 feet from my “breakfast area” which has my espresso machine and a table which faces my outdoor patio and pool area with outdoor furniture which is about 100 feet or so away from the base transmitter.

    So regardless of what VOIP software I am using, and where I am in my house, I get the same crystal-clear voice quality as if I am sitting right in front of my PC. For example, this wearable computing podcast that I recorded with Rick Vanover of Veeam was actually done in my living room, while wearing the H820e using Skype.

    So the quality of the audio is without dispute. What about the overall design and using it?

    The H820e was designed for use for hours at a time. The stereo version is comfortable and after a while you forget you even have it on your head. While I am extremely pleased with the device, I have only a few nitpicks:

    First, the “Mute” button is attached to the microphone boom and is recessed back towards where the headphone is. It doesn’t stick out prominently, so you have to sort of feel your way up the boom to find it.

    If you’re away from your PC and are not near the software controls of your VOIP client, and some sort of unplanned audio distraction occurs that you don’t want to be heard by everyone else, then it could take a few seconds to mute the audio while you fumble around with the boom. It would be better if in the next version of this product that they put it on the exterior side of the headphone holding the boom.

    It’s a minor annoyance but it’s still an annoyance nonetheless.

    The second is the boom mic’s sensitivity to airflow. Now, normally you don’t have a lot of “wind” in an indoor or office setting but in the summertime in Florida, I like to have a fan going in my office for better air circulation.

    If that fan is pointed directly at me, it sounds like I am in an outdoor breeze. And if you are actually outdoors (like sitting on my patio and having a cup of coffee) and a little bit of wind picks up, you’re going to hear it if the mic isn’t muted, no question.

    Also, if you are a heavy breather, you’ll probably want to have the boom twisted a lot farther away from your mouth than you think you need it.

    Despite what I would call these two minor nitpicks I think the H820e is an excellent product and I heartily recommend it. I’ve also spent some time with their wired headset, the H650e, on business trips with my laptop and also on my Surface RT using Skype and Lync, and the audio is just as high quality as the H820e, provided your bandwidth supports the fidelity of the connection.

    Not all telecommuting and conferencing is about audio, however. From time to time I do need to do video as well.

    My corporate laptop, my Lenovo X1 Carbon is a great little machine but its webcam isn’t its strong suit. When it’s docked to my monitor on my desk at home, I need something that delivers much more robust and HD-quality video.

    I’ve written about small business and SOHO/workgroup video conferencing products before, like Logitech’s BCC950. While the BCC950 is an excellent product for small meeting rooms and for having three to five people on camera at once, it’s overkill for a telecommuter or just someone in a single office.

    Enter the Logitech C930e, a “Business” webcam. Like any other webcam it clips to the top of your monitor and plugs into your USB 2.0 or 3.0 port. But this is no ordinary webcam.

    At a street price of $129.00 it’s more expensive than Logitech’s consumer/prosumer webcam offerings, but there’s considerable enterprise-class video conferencing technology built into this little device.

    First, provided your bandwidth supports it, the C930e can capture 1080p video (or 15MP stills) at 30 frames a second because it includes Scalable Video Coding using H.264 and UVC 1.5, the second of which is needed to be certified for use with corporate-grade video conferencing tools.

    Second, the camera has a 90-degree diagonal field of view so you get a widescreen capture of the subject without any “fish eye” distortion. You also get a Carl Zeiss lens and 4X digital zoom with software pan and tilt control, as well as built-in stereo microphones.

    Logitech also offers the consumer-oriented C920 which is about $30 cheaper than the C930e, but it lacks the Scalable Video Coding and UVC 1.5 capabilities used with corporate applications like Lync and Cisco UC and is more suited towards Skype and other consumer video applications like Google Hangouts. It also lacks the 90-degree FOV of its more expensive sibling.

    While the two cameras look very similar, they shouldn’t be confused with each other. If corporate video conferencing capability and quality is definitely what you need, you want the C930e.


  • Getting To The Point –

    May 29, 2020 /  Foods & Culinary

    What to Know When Choosing a Beer Bike

    Many people all over the globe prefer a bike as a means of transport. There are numerous reasons for this. One of them is that people want to keep fit, since cycling is a sport that exercises the legs. Others opt to cycle since cycling will make them lose weight. Cycling can help anyone burn many calories as long as he or she cycles at a speed of fourteen miles per hour. The good thing about cycling is that anyone can do it regardless of age. This means that you do not have to be young or a teenager to cycle. Aged men and women can cycle as long as they have the strength to.

    People lose weight when cycling in proportion to their age as well as their weight. If you want to enjoy the benefits of riding a bike, all you have to do is purchase one. You will then be able to cycle all around town and lose some pounds. You might not have to spend thousands and thousands of your own money to purchase a bike, since you can decide to purchase the bike with other people who would also like one. The good thing about the beer bike is that it accommodates up to fourteen people. Therefore, you do not have to incur the whole amount by yourself; rather, the group of people whom you have identified as loving the sport can make contributions towards the bike. When you buy a bike as a group, you will find that it is very cheap.

    The whole group must buy a decent bike which will be able to satisfy the needs of the people who will be using it. Beer bikes can be quite expensive, and that is why you have to be very keen when buying one. When purchasing a beer bike, you must look at the size frame of the bike. The bike should be able to fit the body height as well as the sex of the people who will be riding it. Since different people will be using the bike, it vital for you to choose the right bike size. It is essential that the bike has at least a two-inch clearance between the frame of the bike and your crotch.

    You should also be keen on choosing the seat height of the bike. This is essential because choosing the right seat height will avoid any injury to the person who will be sitting on the bike. The bike seat should always be comfortable. One of the ways in which you can be sure that the bike seat is of the right size is by confirming that the legs of the rider are extended in the down position. The bike seat should never leave your feet touching the ground; if it does, it implies that the seat is too low. You should also make sure that the seat is level and not inclined in one direction.

    The Beginners Guide To (Finding The Starting Point)


  • If You Think You Understand , Then This Might Change Your Mind

    May 29, 2020 /  Arts & Entertainment

    Tips for Choosing the Right Life Transition Coaching Services

    It is a known fact that change is the only constant thing in life. We do go through various changes in life that shape us in unexpected ways. Experiences such as the loss of a loved one, betrayal, loss of job or career, trauma, miscarriage, terminal diagnosis to mention but a few do have a great impact on the way we lead our life since they all involve leaving behind something that once mattered to us. In simple words, change can involve some form of grief and this is where life transition coaching services come in. Through these services, you will get all the support you need to navigate such trying times and overcome the obstacles that are holding you back from achieving your goals. Finding a reliable life transition coach may however be quite daunting as such service providers are on a high increase in the market. You may, therefore, need to follow some tips as will be highlighted below to make the right choice.

    Training and certification are some of the crucial aspects you should look at when choosing a life transition coach. Life coaching is normally based on scientific research and evidence-based techniques. To learn this, coaches need to go through intense training programs. Considering that there is very little regulation of the coaching industry, meaning that anyone can call themselves a life coach, you should settle for one that is accredited by a recognized regulatory body such as the International Coach Federation. With such a life coach, you will be assured of exceptional services since they have undergone rigorous training and are well equipped with the right coaching models.

    When choosing a life transition coach, you should also consider their coaching style. Life coaching varies depending on the personality and training of the coach. Some coaching sessions are quite organized and well-structured while others are more open and free-flowing. You should, therefore, identify the coaching style that will best work for you to have an easy time selecting the right coach. You should also inquire about the methodologies and tools used by your preferred life coach before enlisting his or her services. A reputable life coach will utilize proven coaching techniques and methodologies to help you achieve your life transitioning goals.

    The other essential aspect you should look at when choosing a life transition coach is the reputation. When going through some changes such as grief, the last thing you would need is frustration from your life coach. You should, therefore, look for a life coach with exceptional interpersonal skills to be assured of a smooth transition. A reputable life coach will be accommodating to you, have great listening skills and also make you feel comfortable during your sessions. You can seek recommendations from family members, friends or colleagues that have previously sought life transition coaching services as they may direct you to some of the best coaches within your local area. You can also read through the online reviews posted on reliable websites as they will give you an insight into what to expect from the various life transition coaching services.

    What No One Knows About

    Where To Start with and More

  • What I Can Teach You About

    May 29, 2020 /  Arts & Entertainment

    Tips For Buying Antique Porcelain

    Buying antique porcelain can be an overwhelming task. Nowadays, there are different platforms where a person can think about buying one. It is advisable to be careful while purchasing one. In order to choose the right antique porcelain, it is advisable to check certain factors. One can consider purchasing antique porcelain from different kinds of websites. One should make sure to buy antique porcelain from a trusted and reliable website. Because many websites offer different kinds of products, it is always hard to check all of the imperfections, but the condition should always be stated in an accurate manner. When one purchases antique porcelain on a website, it is advisable to call the site so they can double-check that it has no problems and is in the best condition.

    Most of the companies ship to their clients through different shipping companies. They don’t charge any fee to ship the products of their clients. As you purchase antique porcelain, it is important to know there are different types. There are dissimilar kinds of ultimate buyers, for instance decorators, speculators, museums, and collectors. The price of an antique will be high especially if the buyer wants the worth of his money and has a high expectation that a museum will eventually want to buy it as a great investment. So that you can easily understand the market, it is important not only to follow the existing price level but also to have an idea of where the demand is going to come from and the availability of a given item.

    It is advisable to check the origin as you generalize the art. You will find that some of them are made from the materials which have a strong universal interest while others depend on the purchasing power of the country from the origin country. For instance, the Japanese porcelain depends principally on the Japanese economy. Chinese porcelain is a particular kind that has a fairly worldwide interest and due to add request and ever falling supply in the west. As you buy an antique collector, it is the advice of collectors. The aesthetic appeal should always be able to determine price should not be a factor. For instance, the price should not be very exciting, it is always possible to buy the best antique. This means that the many pieces of the chine ceramics are not always superb.

    The aesthetic factors are at times outweighed by the country of origin and fashion. For instance, whether you are starting to collect or already do, it is always good to compare the possible buying price against the aesthetic appeal. This may lead to a more hectic mixture; however, the collection will then run along without much of the pleasure. Buying an antique at auction can be a tricky kind of business, as each country has its own rules on how it conducts its business.

    The Key Elements of Great

    5 Takeaways That I Learned About

  • Why not learn more about ?

    May 29, 2020 /  Relationships

    What You Need to Know Before You Buy a Labrador

    Choosing the right breed of dog can sometimes be difficult. This is because they are all types of breeds and each breed has its own pros and cons. However, in this article, you are going to learn about labradors. These types of dogs have amazing qualities and as a result, they are quite a common breed. Before you decide on which breed you want, it is important that you know a few things about them. This is so that, by the time you adopt one or even buy one, you have a clear picture of how they are and on how you can take good care of them. In the paragraphs below, you will learn briefly about what you must know before you bring a Labrador puppy into your home.

    They are Outgoing
    If you are looking for some fun in your life or for a buddy that you can play with, buying a Labrador puppy is the best decision that you can make. These types of dogs are known to be outgoing and extremely friendly. They love having fun and pleasing their masters whenever they can. It is because of their outgoing and friendly nature that they are often considered as the best family pets.

    They Need a Lot of Time
    Since Labradors are outgoing and sociable, it means that they require more attention compared to other breeds of dogs. Before you buy a Labrador puppy or before you adopt one, it is important for you to be sure that there is someone constantly taking care of them. So if you do not have the time to always be at home, it is good if you make a purchase once you are sure that you can provide them with the attention and the time that they need.

    Consider the Space That You Have
    Another factor that you need to consider is whether you have enough space in your home to bring up a Labrador puppy. When you buy them they will just be tiny cute puppies. However, as they continue to grow they become this beautiful large breed of dogs that are playful and sometimes clumsy. Hence, you must make sure that your home can accommodate a Labrador comfortably from the time they are still a puppy to when they are fully grown.

    Change Your Lifestyle
    Bringing a puppy home is like bringing a baby home. The minute you decide to own a Labrador puppy, there are a number of things in your life that will need to change. Your lifestyle must change for you to make sure that your dog is happy and well taken care of. So if you are the kind of person who constantly travels, owning a Labrador may not be the best decision that you can make. Since they need your attention you need to be in a position to provide them with that.

    Look Into the Budget
    Finally, consider the budget. The process of looking at the budget is something that you should start immediately. You will need to determine how much it is going to cost you to buy the puppy and how much it will cost you over the years to take care of it. Look at the numbers and if you cannot afford it, it’s better if you seek an alternative.

    Why No One Talks About Anymore

    Smart Tips For Uncovering

  • Smart Ideas: Revisited

    May 29, 2020 /  Arts & Entertainment

    How to Choose the Best Sex Toys Store
    Despite them being very popular, sex toys are quite controversial. They are available at sex toys outlets in major and even small towns and cities. There are surprisingly many sex toy stores to choose from in the market. Read the article below to find out more about some factors that you should consider when buying toys from a sex toy store.
    You should take into account the cost of buying toys from a particular store. Before you make any purchases from a particular sex toy store it is advisable to find out the current market prices for the toys you want. Some stores might sell sex toys at abnormally high prices to unwitting consumers. You will avoid such sex toy stores if you know the current market prices for the toys you want. You should be wary of sex toy stores that sell their products at abnormally low prices. You might discover that the toys they are selling are fake which makes them very dangerous. You should buy toys from a reputable store that has all the toys you need at a reasonable price. A good sex toy store may also have a fast and privacy-sensitive website that does not collect sensitive information.
    Another thing you have to consider is a sex toy store’s reputation. Sex toy stores gain popularity among customers and other players in the market if they can provide high-quality toys consistently at an affordable cost. It is advisable to determine whether a sex toy store sources its toys legally and ethically. Sex toy manufacturers have to stick by certain rules and procedures formulated by the regulatory authorities. Certain materials and chemicals should never be used in the production of sex toys. Exposing yourself to these chemicals or materials can affect your health severely. The sex toys you buy must be approved by the relevant regulatory authorities.
    You also have to consider the proximity of the sex toy store to your area of residence. A sex toy store whose premises is nearby will deliver toys to you within a short period. Besides, there are also other concerns. As mentioned earlier, sex toys are very controversial. You need to buy them in privacy and have them delivered as soon as possible if it’s online. You can only do this if the sex toy shop is located close to your home.
    You also have to take into consideration the quality of customer service provided by a sex toy store. Sex toys are rather sensitive items. Sometimes they can be faulty and you might not get the ones you want. The moment you buy a sex toy, no store can allow you to return it. You might therefore, need attendants to give you as much information as possible about a sex toy.

    5 Uses For


  • A 10-Point Plan for (Without Being Overwhelmed)

    May 28, 2020 /  Relationships

    What a Person Can Expect from Email Preview Services
    Email preview services have various uses for every individual. They can help with reviving connections, help with avoiding emails that are fraudulent or sham, and furthermore identify senders. There are also essential updates and new tidbits that email preview services will keep a person informed about.
    Email review services assists with fixing the affiliations that are lost. An individual can fundamentally look by name, state and city, for the circumstance that an individual knows it and the site will have the choice to find their email addresses. Additionally, the site will give an individual any data that can be found as the location of the individual and telephone number. A person should not worry anymore about losing their friendship connections that are precious. Keeping in contact is made in a way that is simple.
    From time to time an individual can get an email that is weird looking that an individual is not exactly certain who the sender is. By using the email preview services, an individual sorts in the email address being referred to. Then a person will be able to see who the sender is and any contact information about them. The results yield any data available like phone numbers, address, and the name of the sender. This can be a tool that is great against spam, users that are malicious and any other solicitations and emails that are not wanted.
    The features of updates on this site licenses clients to get information that is revived on what another person experienced. It likewise permits the customers to post their encounters that they have experienced or pose inquiries to the network site. For example, an individual recorded an email address that is suspicious. There is an account that is shy of posting the specific email address, the motivation behind why it is suspicious and the results of the activities. This is an element that is incredible for sharing data and gaining from the occurrences of others that are unpleasant.
    About now a person is probably curious about the types of services that are run. The sites yield viable results that are many for a little more when compared to calling information from the phone of a person. An individual can get a lot more data that an individual would ever get from the services of calling. The site also lets an individual review their results before charging a person. An individual can see all the data that is required about the email. Email preview services offers a deal that is great and protection for the users of email.

    The Best Advice About I’ve Ever Written


  • The Beginner’s Guide to

    May 28, 2020 /  Travel

    Points to Choosing An Authentic Computer and Phone Repair Company

    Phones and computers are important gadgets in technology. While using a phone or computer, it is prone to damage here and there. To get your phone or computer repaired after a malfunction, you should contact a computer and phone repair company. Due to an increase in phones and computers, there is also an increase in phone and computer repair companies. Due to the many companies on the web claiming to offer the best repair services, you will find it challenging to find the best. Below are some elements you should consider when hiring a phone and computer repair company.

    Start by checking on the level of experience of a computer and phone repair company. Go for a company that does not only have experience but relevant experience. A computer and phone repair company that has not registered success over all the operational years is not reliable. A company is known for unreliable services if it has employed inexperienced computer and phone specialists. These computer and phone specialists should have enough knowledge and experience in computer and phone maintenance services. A computer and phone repair company that has relevant experience is likely to have more clients coming for computer and phone repair and maintenance services.

    Consider the level of professionalism too. A professional computer and phone specialist is one who has been trained about all phone and computer operations . They should provide repair and maintenance services without encountering any challenge since they have been trained. Certificates from the computer and phone specialists will guide on whether to hire them or not. An inexperienced computer and phone specialist might lead to the poor performance of a computer or phone.

    Consider the level of organization of a particular computer and phone repair company. A computer and phone repair company that has poor operations is likely to be disorganized. You can tell a company is organized by its level of intelligence and experience. If you find the company’s computer or phone specialist approaching your objectives well, then the company is well organized.

    Lastly, consider a computer and phone repair company that can build a strong relationship and bond. Be honest and transparent enough when giving your computer or phone issues to help your computer or phone repair specialist understand the problem behind the computer or phone problem. Poor communication skills can lead to misunderstandings and fatal performance problems thereafter. For great performance phones and computers after repair and maintenance, you should build a strong relationship bond with your computer and phone repair company. Go for an accessible computer or phone repair specialist from the computer and phone repair company. You will also take back your computer or phone to the company in case of any problem.

    Finding Similarities Between and Life

    What You Should Know About This Year