WAND Wandisco Plc

63.60
0.00 (0.00%)
Last Updated: 01:00:00
Delayed by 15 minutes
Share Name: Wandisco Plc
Share Symbol: LSE:WAND
Market: London
Type: Ordinary Share
ISIN: JE00B6Y3DV84
Description: ORD 10P

Price Change: 0.00 (0.00%)
Share Price: 63.60
Bid Price: 63.80
Offer Price: 65.20
High / Low / Open Price: -
Shares Traded: 0.00
Last Trade: 01:00:00
Industry Sector | Turnover | Profit | EPS - Basic | PE Ratio | Market Cap
0 | 0 | N/A | 0

Wandisco Share Discussion Threads

Showing 4401 to 4423 of 6575 messages
20/2/2019
13:30
All of the cos going for it, Azure and AWS the most likely, are US, so assuming Fusion is part of it (I'd imagine if AWS wins it, WAND would need to be in its highest partnership tier, which it now is - I didn't realise it wasn't!) they'll be a relatively small component revenue-wise against the total spend.
tickboo
20/2/2019
12:18
Under the Trump administration a UK-listed company would have to offer something special to get a large government contract.
jackdaw4243
19/2/2019
12:05
Although Edison's note had revenue a little down, at least it had mms bookings higher and an okay % for Fusion, which clearly needs to accelerate this year. Good news they're closer to AWS and at the highest rating with them, which would be needed if they win the JEDI US gov contract.
tickboo
19/2/2019
11:22
Yeah that's sly and I think reflects poorly on them. Board expectations back at the interim were aligned with market forecasts, but now they're not, I guess due to the new Edison note. Poor whichever way you look at it. Agree that it looks better with the cash raised and the good start to the year, but to really build momentum we need some RNSs with significant deals.
tickboo
19/2/2019
11:18
In the RNS .. "board forecast had been achieved" .. which is frankly a silly thing to say .. and differs greatly from "we will be in line with market expectations", which is what they said at the half year ..

That said .. new balance sheet, new year ..

Spin the wheel .. big news coming.

knighttokingprawn
19/2/2019
10:51
Tick

Where did they say that "forecasts had been achieved"?

jackdaw4243
19/2/2019
10:21
I'm not surprised with the fall as profit is profit. I am hoping we get some positive RNSs in the coming weeks. The improved balance sheet may have been needed to close some big deals. We'll see. I think it's a bit cheeky of them saying board forecasts were achieved when they clearly weren't. Anyway, hopefully Is are being dotted and Ts crossed on some deals.
tickboo
18/2/2019
15:09
Healthcare firms go for the hybrid cloud approach, with compliance and connectivity key.

It continues to be a hybrid cloud-dominated landscape – and according to new research one of the traditionally toughest industries in terms of cloud adoption is now seeing it as a priority. A report from enterprise cloud provider Nutanix has found that in two years' time, more than a third (37%) of healthcare organisations polled said they would deploy hybrid cloud. This represents a major increase from less than a fifth (19%) today.

The study, which polled more than 2,300 IT decision makers, including 345 global healthcare organisations, found more than a quarter (28%) of respondents saw security and compliance as the number one factor in choosing where to run workloads. It's not entirely surprising. All data can be seen as equal, but healthcare is certainly an industry where the data which comes from it is more equal than others. Factor in compliance initiatives, particularly HIPAA, and it's clear to see how vital the security message is.

Yet another key area is IT spending. The survey found healthcare organisations were around 40% over budget when it came to public cloud spend, compared to a 35% average for other industries. Organisations polled who currently use public cloud spend around a quarter (26%) of their annual IT budget on it – a number which is expected to rise to 35% in two years.

Healthcare firms see ERP and CRM, analytics, containers and IoT – the latter being an evident one for connected medical devices – as important use cases for public cloud. The average penetration in healthcare is just above the global score. 88% of those polled said they see hybrid cloud as positively impacting their businesses – yet skills are a major issue, behind only AI and machine learning as an area where healthcare firms are struggling for talent.

It is certainly an area the largest vendors have been targeting in recent months. Amazon Web Services (AWS) announced in September a partnership with Accenture and Merck to build a cloud-based informatics research platform aiming to help life sciences organisations explore drug development. Google took the opportunity at healthcare conference HIMSS to launch a new cloud healthcare API, focusing on data types such as HL7, FHIR and DICOM.

Naturally, Nutanix is also in the business of helping healthcare organisations with their cloud migrations. Yet increased maturity across the industry will make for interesting reading. The healthcare IT stack of the future will require different workloads in different areas, with connectivity the key. More than half of those polled said 'inter-cloud application mobility' was essential going forward.

"Healthcare organisations especially need the flexibility, ease of management and security that the cloud delivers, and this need will only become more prominent as attacks on systems become more advanced, compliance regulations more stringent, and data storage needs more demanding," said Chris Kozup, Nutanix SVP of global marketing. "As our findings predict, healthcare organisations are bullish on hybrid cloud growth for their core applications and will continue to see it as the ideal solution as we usher in the next era of healthcare.

"With the cloud giving way to new technologies and tools such as machine learning and automation, we expect to see positive changes leading to better healthcare solutions in the long run," Kozup added.
tickboo
18/2/2019
10:11
If they release big contract news it is a lot more likely. Here's hoping.
tickboo
18/2/2019
09:15
My prediction of 850p last Thursday fulfilled itself rather quicker than I expected.
The area around 850 is rather nebulous. It is not beyond credibility that 1200p could be achieved just as suddenly!

horneblower
15/2/2019
20:36
See The "Books " profit and loss and salaries.
jackdaw4243
15/2/2019
18:22
On the Microsoft website. Great to see WAND being pushed and clearly working closely with Azure. IBM seems to be getting closer with their joint engineering as well, and AWS too. Looking good, as they're reliant on the partners selling Fusion.

James Baker, Program Manager, Azure Storage:

On February 7, 2019 we announced the general availability of Azure Data Lake Storage (ADLS) Gen2. Azure is now the only cloud provider to offer a no-compromise cloud storage solution that is fast, secure, massively scalable, cost-effective, and fully capable of running the most demanding production workloads. In this blog post we'll take a closer look at the technical foundation of ADLS that will power the end to end analytics scenarios our customers demand.

ADLS is the only cloud storage service that is purpose-built for big data analytics. It is designed to integrate with a broad range of analytics frameworks enabling a true enterprise data lake, maximizes performance via true filesystem semantics, scales to meet the needs of the most demanding analytics workloads, is priced at cloud object storage rates, and is flexible enough to support a broad range of workloads so that you are not required to create silos for your data.

A foundational part of the platform

The Azure Analytics Platform not only features a great data lake for storing your data with ADLS, but is rich with additional services and a vibrant ecosystem that allows you to succeed with your end to end analytics pipelines. Azure features services such as HDInsight and Azure Databricks for processing data, Azure Data Factory to ingest and orchestrate, and Azure SQL Data Warehouse, Azure Analysis Services and Power BI to consume your data in a pattern known as the Modern Data Warehouse, allowing you to maximize the benefit of your enterprise data lake. Additionally, an ecosystem of popular analytics tools and frameworks integrates with ADLS so that you can build the solution that meets your needs.

"Data management and data governance is top of mind for customers implementing cloud analytics solutions. The Azure Data Lake Storage Gen2 team have been fantastic partners ensuring tight integration to provide a best-in-class customer experience as our customers adopt ADLS Gen2." – Ronen Schwartz, Senior Vice President & General Manager of Data Integration and Cloud Integration, Informatica

"WANdisco's Fusion data replication technology combined with Azure Data Lake Storage Gen2 provides our customers a compelling LiveData solution for hybrid analytics by enabling easy access to Azure Data Services without imposing any downtime or disruption to on-premise operations." – David Richards, Co-Founder and CEO, WANdisco

"Microsoft continues to innovate in providing scalable, secure infrastructure which go hand in hand with Cloudera's mission of delivering on the Enterprise Data Cloud. We are very pleased to see Azure Data Lake Storage Gen2 roll out globally. Our mutual customers can take advantage of the simplicity of administration this storage option provides when combined with our analytics platform." – Vikram Makhija, General Manager for Cloud, Cloudera

Performance

Performance is the number one driver of value for big data analytics workloads. The reason for this is simple: the more performant the storage layer, the less compute (the expensive part!) is required to extract the value from your data. Therefore, not only do you gain a competitive advantage by achieving insights sooner, you do so at a significantly reduced cost.

"We saw a 40 percent performance improvement and a significant reduction of our storage footprint after testing one of our market risk analytics workflows at Zurich's Investment Management on Azure Data Lake Storage Gen2." – Valerio Bürker, Program Manager Investment Information Solutions, Zurich Insurance

Let's look at how ADLS achieves this performance. The most notable feature is the Hierarchical Namespace (HNS), which allows this massively scalable storage service to arrange your data like a filesystem with a hierarchy of directories. All analytics frameworks (e.g. Spark, Hive, etc.) are built with an implicit assumption that the underlying storage service is a hierarchical filesystem. This is most obvious when data is written to temporary directories which are renamed at the completion of the job. For traditional cloud-based object stores this is an O(n) operation – n copies and deletes – that dramatically impacts performance. In ADLS this rename is a single atomic metadata operation.

The other contributor to performance is the Azure Blob Filesystem (ABFS) driver. This driver takes advantage of the fact that the ADLS endpoint is optimized for big data analytics workloads. These workloads are most sensitive to maximizing throughput via large IO operations, as distinct from other general-purpose cloud stores that must optimize for a much larger range of IO operations. This level of optimization leads to significant IO performance improvements that directly benefit the performance and cost of running big data analytics workloads on Azure. The ABFS driver is contributed as part of Apache Hadoop® and is available in HDInsight and Azure Databricks, as well as other commercial Hadoop distributions.
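As a rough illustration of what working against ADLS Gen2 through the ABFS driver looks like from Spark, here is a minimal PySpark sketch (my own example, not from the blog: the account, container, directory and column names are placeholders, and the cluster is assumed to already hold credentials for the storage account):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-gen2-example").getOrCreate()

# Placeholder storage account ("examplestore") and container ("analytics") names.
src = "abfss://analytics@examplestore.dfs.core.windows.net/raw/events/"
dst = "abfss://analytics@examplestore.dfs.core.windows.net/curated/events/"

df = spark.read.parquet(src)          # large sequential reads go through the ABFS driver
df.filter("event_type = 'purchase'") \
  .write.mode("overwrite") \
  .parquet(dst)                       # the job-commit rename is a single atomic metadata operation under HNS

The point of the sketch is simply that nothing changes in the Spark code itself; the abfss:// paths and the driver do the work.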
Scalable

Scalability for big data analytics is also critically important. There's no point having a solution that works great for a few TBs of data but collapses as the data size inevitably grows. The rate of growth of big data analytics projects tends to be non-linear as a consequence of more diverse and accessible sources of data. Most projects benefit from the principle that the more data you have, the better the insights; however, this leads to design challenges, since the system must scale at the same rate as the growth of the data. One of the great design pivots of big data analytics frameworks such as Hadoop and Spark is that they scale horizontally: as the data and/or processing grows, you can just add more nodes to your cluster and the processing continues unabated. This, however, relies on the storage layer scaling linearly as well. This is where the value of building ADLS on top of the existing Azure Blob service shines. The EB scale of that service now applies to ADLS, ensuring that no limits exist on the amount of data to be stored or accessed. In practical terms, customers can store 100s of PB of data which can be accessed with throughput to satisfy the most demanding workloads.

Secure

For customers wanting to build a data lake to serve the entire enterprise, security is no lightweight consideration. There are multiple aspects to providing end to end security for your data lake:

Authentication – Azure Active Directory OAuth bearer tokens provide industry-standard authentication mechanisms, backed by the same identity service used throughout Azure and Office 365.

Access control – A combination of Azure Role Based Access Control (RBAC) and POSIX-compliant Access Control Lists (ACLs) provides flexible and scalable access control. Significantly, the POSIX ACLs are the same mechanism used within Hadoop.

Encryption at rest and in transit – Data stored in ADLS is encrypted using either a system-supplied or customer-managed key. Additionally, data is encrypted using TLS 1.2 whilst in transit.

Network transport security – Given that ADLS exposes endpoints on the public Internet, transport-level protections are provided via Storage Firewalls that securely restrict where the data may be accessed from, enforced at the packet level.

Tight integration with analytics frameworks results in an end to end secure pipeline. The HDInsight Enterprise Security Package makes end-user authentication flow through the cluster and to the data in the data lake.

Get started today!
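To make the POSIX ACL point above concrete, a small sketch using the azure-storage-file-datalake and azure-identity Python packages (again my own example rather than anything from the blog; the account, filesystem and directory names are made up):

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder account and filesystem names.
service = DataLakeServiceClient(
    account_url="https://examplestore.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
directory = service.get_file_system_client("analytics").get_directory_client("curated/events")

# Apply a POSIX-style ACL to the directory (the same user::/group::/other:: syntax Hadoop uses),
# then read it back to confirm what is set.
directory.set_access_control(acl="user::rwx,group::r-x,other::---")
print(directory.get_access_control()["acl"])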
tickboo
15/2/2019
17:47
Indeed, growth and in the black rather than the red. Good to see the IIs involved, who are no mugs, but the jam tomorrow won't be palatable for much longer. The accounts will be a mess as Edison's note suggests; WAND expect to have $10.7m cash so have clearly burnt an awful lot in '18. I assume they are genuine when they say the start to the year has been good, looking at their recruitment and pipeline (so they say).
tickboo
15/2/2019
17:33
Tick

"properly accelerate growth" I am sure you mean with profit, there has to be a return some place for the outlay. There are companies paying a 5% divided along with capital appreciation. Let's see the accounts , we have seen the smoke.

jackdaw4243
15/2/2019
17:05
Thanks and hopefully this really is the last cash call. Hopefully they'll properly accelerate growth.
tickboo
15/2/2019
16:54
Your position is looking a lot healthier now tickboo, well done.
owenski
15/2/2019
14:48
IBM EMBRACES MULTICLOUD: WAREHOUSE AVAILABILITY ON AWS, ADDS ELASTIC SMP

The focus on customer needs for greater choice and flexibility is a constant at the IBM Think 2019 conference. Nowhere is this more evident than in IBM Hybrid Data Management, which supports data of any type, source and structure, be it on-premises or in the cloud.

On Monday, the level of customer choice expanded even further with two key announcements, including IBM Db2 Warehouse on Cloud, now available on Amazon Web Services (AWS), and the introduction of a cost optimized, symmetric multiprocessing (SMP) elastic scaling configuration. Both announcements are great news for organizations seeking to take advantage of multicloud environments and dynamically scale resources to meet their exact data needs.

DIGGING DEEPER

Multicloud environments are more than just a trend. They're how the majority of businesses are choosing to operate. A recent survey conducted by the IBM Institute for Business Value found that 85 percent of companies are "operating in a multicloud environment today." That's consistent with IDC's prediction for 2021 that "over 90 percent of enterprises will use multiple cloud services and platforms."

Full Source: IBM DataHub
tickboo
15/2/2019
10:49
WAND blog tweeted:

Andrea: Can you please share why WANdisco is so excited about the ADLS Gen2 GA announcement?

Paul: The general availability of ADLS Gen2 is a significant announcement from Microsoft, and being able to take advantage of it without downtime or disruption will be important for every organization. The benefits that come from bringing your large-scale data sets to ADLS Gen2 are huge, and Microsoft recognizes the importance of a strong ecosystem of partners to help make this happen. WANdisco is excited to be leading the way with solutions for hybrid architectures and migration strategies for data at any scale. WANdisco solutions eliminate the risk of disruption to business applications, and are compatible with the big data technologies that enterprises are using today. We make it easy to adopt ADLS Gen2.

Andrea: Why is WANdisco such an important Microsoft partner for Azure customers?

Paul: As Microsoft customers know, ADLS is designed to support petabyte-scale analytics workloads with massive throughput, both of which are key capabilities for enterprise data lakes. Because ADLS Gen2 offers the familiar benefits of ADLS Gen1 – such as file system semantics, structured security and scale – and the performance of Azure Blob Storage, customers can boost the cost-efficiency, performance and scale of their analytics workloads substantially by migrating from ADLS Gen1 to ADLS Gen2. Realizing these benefits requires customers to migrate in a way that doesn't disrupt the critical analytics workloads already running in ADLS Gen1, and we know that's not a trivial problem. The WANdisco Fusion platform helps Azure customers avoid the data consistency challenges of migrating large, fast-changing data sets, and that's why WANdisco is a stand-out partner for Microsoft.

Andrea: Can you give us some examples of these data consistency challenges, and how WANdisco Fusion solves them?

Paul: Enterprise IT teams face the reality that it's not possible to move petabytes of data overnight: you have to do it over time, without stopping applications that depend on that data. While this migration is happening, you also need to continually update your cloud data to reflect any changes made on-premises. From that vantage point, the question of data consistency becomes crucial, and businesses need to ask themselves how to achieve this while replicating their changing data. If you have analytics applications in the cloud and on-premises that need to access the same data, strong – not eventual – consistency is critical to ensure that all your users are working with the same information. There are vendors that offer eventual consistency via change data capture tools, which is effectively a form of transaction log replay. But only WANdisco Fusion's distributed coordination engine (DConE) enforces consistency by coordinating the activities performed by big data applications against their data. Without this approach it is impossible to avoid conflicting changes being made to the same data in different locations (e.g. from applications working against ADLS Gen1 and other applications using ADLS Gen2). Reconciling conflicts between data at scale can be essentially impossible, so avoiding them in the first place is critical to solving the challenges that come with data migration. At scale, data replication for hybrid architectures must address multiple levels of information – kind of like a layer cake. WANdisco Fusion works over these layers.

Andrea: Thank you – that's an interesting metaphor. So if the first layer of the cake is the data itself, what's the next one?

Paul: The next layer of information is your metadata, and it's just as important as the data itself. A good example of this is security metadata: the policies and permissions that you apply to your data to control and limit access. If the security of your data is important, you need to keep this type of metadata consistent in exactly the same way, rather than leaving your data exposed while you attempt to rebuild those policies later. Aside from the huge amount of work that would involve, it's a prime risk area that could leave you with some very serious exposure. If we think specifically about migrating data from ADLS Gen1 to ADLS Gen2, this is an important consideration. Crucially, ADLS Gen1 customers who want to use ADLS Gen2 can't afford the cost of rebuilding security policies for their data, so WANdisco Fusion really is a key enabler for these types of migration projects.

Andrea: What are the other layers in the data replication model, and why are they important to consider in hybrid cloud operations?

Paul: There are multiple layers of metadata – a technology like Apache Hive is a good example. Hive allows you to apply structure to your data for analytics purposes. This metadata lets applications query big data without the need to transform its structure first. WANdisco Fusion can replicate this metadata so that hybrid architectures can take advantage of information at scale using standard analytic toolsets across multiple environments. The final layer consists of the big data applications themselves. Bringing applications to a new storage platform can be risky if you've taken a big-bang approach and cut everything over at the same time, because there's no way to fall back if something goes wrong. Having the ability to eliminate that risk by being able to test applications individually and over an extended period with a hybrid architecture is going to be critical. With WANdisco Fusion this strategy becomes simple, and can account for failures either with individual applications, or even entire clusters.

Andrea: So WANdisco Fusion also provides DR for analytics workloads?

Paul: That's right – as you might expect, the strong data consistency that WANdisco Fusion provides also makes it a natural choice to deliver high-availability and disaster-recovery capabilities in the Azure cloud. Pranav Rastogi, Program Manager, Azure Big Data, Microsoft, touches on these capabilities in a blog, which you can read here.

Andrea: If any Microsoft customer wants to learn more about using WANdisco Fusion to support a simple, disruption-free migration from ADLS Gen1 to ADLS Gen2, what should they do next?

Paul: WANdisco has been delivering solutions for migrating big data environments for many years, and WANdisco Fusion is a natural fit for customers wanting to do the same for ADLS Gen2. If you want to dig into some more of the technical details, our associates on the Azure ADLS team have published a terrific blog here, or you can watch our WANdisco Fusion demo here. Or if you're ready to explore how the solution works in the real world, click here to learn how AMD uses WANdisco Fusion to protect critical business data against disaster.
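To picture what "enforcing consistency by coordinating activities" means in practice, here is a toy Python sketch of the general idea: every replica applies the same agreed order of operations, so their contents cannot diverge. This is purely illustrative and says nothing about how DConE is actually implemented.

class Replica:
    """A trivially simple file store: path -> contents."""
    def __init__(self, name):
        self.name = name
        self.files = {}

    def apply(self, op):
        kind, path, value = op
        if kind == "put":
            self.files[path] = value
        elif kind == "delete":
            self.files.pop(path, None)

def replicate(ordered_log, replicas):
    """Apply one totally ordered log of operations to every replica."""
    for op in ordered_log:
        for replica in replicas:
            replica.apply(op)

on_prem = Replica("on-prem HDFS")
cloud   = Replica("ADLS Gen2")

# Writers in different locations propose changes; the coordination layer's job
# is to agree on ONE order before anything is applied anywhere.
agreed_order = [
    ("put", "/data/sales/2019-02.csv", "v1"),
    ("put", "/data/sales/2019-02.csv", "v2"),   # the later edit wins everywhere
    ("delete", "/tmp/_staging", None),
]

replicate(agreed_order, [on_prem, cloud])
assert on_prem.files == cloud.files   # replicas end up identical

If the two writes were instead applied in different orders on each side, the stores would diverge; that kind of reordering is exactly what log-replay, eventually-consistent approaches cannot rule out.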
tickboo
15/2/2019
09:58
Think the numbers are definitely skewed to the upside and it could be quite digital in nature .. I don't see this as being a slow ramp with the "strategic customer" .. obviously it now needs to happen .. but Msft for example do seem to be getting behind the solution, and despite concerns about revenue we do have validation of a broad swathe of use cases .. it is jam tomorrow, but I hear nothing but good things from the tech industry about the WAND product .. so it could be a big pot of jam
knighttokingprawn
15/2/2019
09:49
Also from the note: "As highlighted in our last update note, there could be upside to FY19 estimates if a strategic deal is secured"
tickboo
15/2/2019
09:06
Tick

Think the strategic deal should follow in quick order ... additionally underpinned now by the AWS agreement ... we have motored a bit here so probably need to consolidate .. but think we should get the popcorn out and enjoy the show for the next few months ...

knighttokingprawn
14/2/2019
17:05
Horneblower

TNX you are correct, I bought some Standard Chartered yesterday and as my broker put through the buy it came up on my screen as a sell.

Having a good day, just cashed in my AZN.

jackdaw4243
14/2/2019
16:35
Thanks, and a pity you didn't action that buy. Interesting to see what tomorrow brings. I keep banging on about the strategic deal mentioned in Edison's note, and I can only imagine a stronger balance sheet and recruiting mainly engineers will strengthen WAND's position. Good end to the day and here's hoping for a blue day tomorrow too.
tickboo