Magento 1.9 won't update - SQL

I have installed Better Blog on my Magento store. We created a couple of blog posts using it, but when we went back in to make changes, like fixing spelling mistakes, the posts won't update, even though the changes show within the editor.
I have tried flushing the Magento cache, and even deleted the contents of var/cache over FTP.

Related

Reload on Solr (Google Cloud)

I was having issues with submitting a document into Solr on Google Cloud and read somewhere that the issue should be resolved by committing.
I couldn't figure out how to commit in Solr (I'm a noob) and pressed a button called Reload. The error went away, but I'm afraid I messed something else up. Can anyone explain what reload does compared to commit, or confirm whether reload was fine?
No, reload isn't fine if you want a commit.
The reload command tells Solr to reinitialize a core based on new configuration (solrconfig.xml, the schema and other config files). Even if it worked in your case, it's not meant to serve as a commit.
The commit command tells Solr that the data sent to it should be made searchable as soon as possible. I guess that's what you're looking for.
For this you can configure automatic commits and/or soft commits in solrconfig.xml. There's also a URL you can call to trigger a commit, something like this: http://localhost:8983/solr/mycollection/update?commit=true
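For instance, with Python's requests library (a sketch - I'm assuming the core name mycollection from the URL above and a plain single-core setup; on SolrCloud the reload would go through the Collections API instead):

    import requests

    SOLR = "http://localhost:8983/solr"

    # explicit commit: an empty update request with commit=true is enough
    # to make everything indexed so far searchable
    resp = requests.post(SOLR + "/mycollection/update",
                         params={"commit": "true"},
                         headers={"Content-Type": "application/json"},
                         data="{}")
    resp.raise_for_status()

    # for contrast, a core RELOAD (most likely what the button did):
    # it re-reads solrconfig.xml and the schema; it does not commit data
    resp = requests.get(SOLR + "/admin/cores",
                        params={"action": "RELOAD", "core": "mycollection"})
    resp.raise_for_status()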
I recommend reading these docs:
Commit
Reload

Checking in pending changes in TFS does not affect source code

I'm an extreme newbie to managing TFS, so please bear with me and know I'll need baby steps. I'll try to be as specific as possible.
I recently inherited an ASP.NET MVC website written by a former colleague. Generally he would work directly in the production environment and commit changes as he went along. Obviously that's not good practice, so when I received it I decided to set it up in TFS along with a proper testing and development environment. I created the team project collection, added the existing solution to the collection, set up branching and branch hierarchy, and mapped the work environments. From what I can tell it's set up just like our other site that was configured in TFS before I came on (the person who set it up is long gone).
The issue I'm seeing now is that checking in changes doesn't seem to affect the actual code behind the site. Whether I make the changes in the test branch and then check in and merge the changeset into the production branch, or make the changes directly in production, saving and checking in changes doesn't actually affect the site. If I go into Solution Explorer and look at the files I just edited, my checked-in changes are not there. Same if I edit a web.config or something: I can open it up in another text editor and my changes are nowhere to be found.
I followed Microsoft's instructions as closely as I could, but clearly I missed something - I just have no idea what.

Composite C1 - develop locally, sync to live site

I have a couple of Composite C1 CMS websites.
Currently I edit them using the web-based CMS on the live site.
However, I would like to update the code and content locally in Visual Studio, then sync to the web. The problem: if my local copy is older than the one online (e.g. a non-techy client has edited something on the live site) and I Web Deploy, it will go right over the top of the newer file on the server.
I need a solution that works out which change is newest. I can't find anything on Google or in the C1 docs.
How can I sync, preferably using Web Deploy? Do I need some kind of version control?
Is there a best practice for this? Editing the live site through the web interface seems a bit dicey and is slow.
The general answer to this type of scenario seems to be to use the Package Creator. With that you can develop locally, add the files you've changed to a package, and install that package on the live site. This solution does not cover all parts of your question, though, and has certain limitations:
You cannot selectively add content to a package. It's all pages or no pages.
Adding datatypes is easy, but updating them later requires you to delete the datatype (and data), and recreate the datatype.
In my experience packages work well for incremental site updates, if you limit the package contents to front-end assets like CSS, images and such.
You say you need a solution that works out the newest changes - I believe the only solution to this is yourself, with the aid of some tooling. I don't think there's a silver bullet solution here.
Should you use a version control system? Yes! By all means. Even if you are not sharing your code with anyone, a VCS is a great way to get to know Composite C1 from a file-system perspective, as you can carefully track which files change on disk as you develop. This knowledge is crucial when you want to continuously add features to a website that is already alive and kicking - you need to know what to deploy, and what not to touch.
Make sure you read the docs on how Composite fits in VCS: http://docs.composite.net/Configuration/C1-and-Version-Control
I assume that your sites are using the XML data storage (if you were using the SQL Data Store, your content would not be overridden upon sync).
This means that your entire web application lives in one folder on disk on the web server, which can be an advantage here.
I'll try to outline a solution that could work for you, although I must stress that I've never tried this - I'm making it up as I type.
Let's say you're using git: download the site in its entirety from the production web server, and commit the whole damned thing* to your master branch.
Then create a new feature branch from that commit and start making the changes you want to deploy later, carefully committing your work as you go and making sure you only commit to the feature branch the changes that are needed for your feature to work.
Now, when you are ready to deploy, switch back to the master branch, and again download the entire site and commit it to master.
You then merge your feature branch into the master branch, and have git do all the hard work of stitching your changes in with the changes from the live site. There are bound to be merge conflicts, and that is where you will have to jump in and decide for yourself what content needs to go live.
After this is done and tested, you can web deploy the site up to the production environment.
Changes to the live site might have occurred while you were merging, so consider closing the site, or parts of it, during this process.
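Sketched as commands, the loop could look something like this (again, untested and made up as I type - the branch name and commit messages are placeholders):

    # one-time: snapshot the live site into master
    git init
    git add -A                          # .gitignore applies here, see the note below
    git commit -m "snapshot: production as-is"

    # do the feature work on its own branch
    git checkout -b feature/my-change
    # ...edit locally, committing only what the feature needs...

    # at deploy time: fetch the current live files and snapshot them on master
    git checkout master
    # ...copy the current production files over the working tree (FTP/Web Deploy export)...
    git add -A
    git commit -m "snapshot: production before merge"

    # let git stitch the two histories together; resolve conflicts by hand
    git merge feature/my-change
    # test locally, then Web Deploy the merged result to production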
If you are using the SQL Data Store, I suggest paying for a tool like Red Gate's SQL Compare and SQL Data Compare, or SQL Delta, to compare your dev database to the production database, and hand-pick SQL scripts that can be applied to the production database along with your feature deployment.
* Do consider using a .gitignore file to avoid committing certain files - refer to the docs for more info.
I suppose you should use the Package Creator.
Also have a look here: http://docs.composite.net/Configuration/C1-and-Version-Control

Data changes in RavenDB by itself

I have set up a RavenDB for evaluation. I wrote some code which pushed some documents into it. I then have a web site which renders those documents.
Throughout the day, I used the Raven Studio to modify some text in those documents, so that I could see the changes come through in my web site.
Problem: It seems that after going home for the night, when I come in the next day my database has changed - my documents have reverted to the 'pre-changed' versions... what's going on??
I've looked through the Raven console output, and there were no update commands issued on my developer machine overnight (nor would I expect there to be!!)
Note: this is just running on my development machine.
As far as I know, RavenDB has no code in it that would automatically undo committed write operations, and honestly, that would really scare me. Altogether this sounds really weird and I can't think of a scenario where it could actually happen. I suggest you send the log files to RavenDB support if it happens again, because this would be a really serious issue.
My colleague had this very problem with updates being reverted. The update we made was to add a property, and also a document-specific value for this property, to all the documents. We called SaveConfiguration() and saw the change being made in the Raven Studio. A while later some of the documents had lost their new property.
I decided to turn on logging and therefore added an NLog.config file; to get the logging started I touched the web.config. This of course restarted the application, and voila, the updates appeared in the Raven Studio again.
After a while they disappeared from the Raven Studio, so I assumed that this was a studio problem. I therefore tried to retrieve the objects from the database in a test controller, unfortunately the objects were lacking the property value here too, so it wasn't just a studio problem.
With the logging turned on we updated the documents of the specific type again, and according to the logs, and also the studio, we actually updated the documents. Not long after, the documents reverted by losing their added property yet again (my colleague started crying at this point - true story).
Later I came to realize that this was all because our live web application still held the old version of the object. When it was read in the web application, the data was returned without the extra property. Because of this it seems our DocumentSession thought the object had changed (in all fairness), so when we called SaveChanges even these objects were written to the database - without the extra property.
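Sketched in code, the failure mode I'm describing would look something like this (illustrative only - our actual app is .NET, but here I'm using RavenDB's Python client, and the model class, server URL and document id are all made up):

    from ravendb import DocumentStore

    class BlogPost:
        # the stale model still deployed in the live web app:
        # it predates the newly added 'summary' property
        def __init__(self, title=None):
            self.title = title

    store = DocumentStore(urls=["http://localhost:8080"], database="Demo")
    store.initialize()

    with store.open_session() as session:
        # the document is materialized into the stale model, so the
        # extra 'summary' property never makes it into memory
        post = session.load("blogposts/1", BlogPost)
        post.title = post.title.strip()  # any tracked change marks the document dirty
        session.save_changes()           # writes the stale shape back,
                                         # silently dropping 'summary'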
Is my conclusion correct? What is the solution to this problem? I'm thinking CQRS, because then we will never call "SaveChanges()" on the DocumentSession for reads.
Adam,
Just making sure, did you call SaveChanges() after you made your modifications?
There is absolutely nothing in RavenDB that would cause this behavior.

Creating a test site for updating a CMS

I have been asked by a client to make amendments to their site using the custom CMS that was built for them (by somebody else). Making the changes is not a problem, but they want the changes to be viewed on a test server before going live, and the only way I can think of doing that is by pulling the entire site down, duplicating and reconnecting the databases, and uploading it to a test server. Then I would have to make all the changes twice, which isn't really ideal.
Does anyone know of a way to do this that isn't such a ball ache? There are hundreds of files and data tables, as you would expect with a custom CMS, and for changes that would only take a few hours to make, duplicating the entire site seems a tad unnecessary.
Cheers,
Sam
Does the CMS have "preview mode"?
Typically, in a CMS you make your changes using the content editing interface, save the changes, allow authorized users to view the changes in preview mode, and then change the status to "approved"; this then sends the changes live.
Different products call this by a different name, and have different ways of doing it - but it's worth rooting around in the custom CMS to see if there's something similar.