Has Dreamweaver CS5 been deleting local files instead of remote files?

As scary as it sounds.
I select a remote file, I hit delete. Instead of deleting the selected remote file, Dreamweaver chooses to delete the currently selected LOCAL file(s) - even if they're unrelated.
EDIT: I am not able to 'deselect' the soon-to-be-deleted LOCAL file, not that I should have to. Didn't work this way in CS3.
The only way I caught this at all is that DW tried to delete a file (the LOCAL file in the screenshot, to the right of the image) which had site links. Thank dog it had site links. Shame about all of my other work, which I haven't found yet.
Screenshot attached, describing the steps I took. Also to note, I did search prefs and site prefs etc to see if there was a 'Wreak Havoc Randomly Upon Deletion' checkbox - there was nothing of the sort.
Adobe hasn't answered me for weeks now; I'm about to lodge a support request, but figured I'd start here instead of going through that ordeal.
Thanks for any help/thoughts.
The screenshot is quite large - dual monitors capturing the whole problem in situ. http://codefinger.co.nz/projects/fpx/mod/xeditor/dw_cs5_localFileDeletion_lame.gif

I have come to the conclusion that BACKSPACE deletes remote files and the DEL key deletes local files. I too figured this out the hard way and deleted some important files. I haven't seen anything in the new-features notes about why this was changed, but it would have been nice to let people know about it.

Adobe is aware of this bug and will be addressing it in their next update. There is a simple workaround where you don't have to change views:
Select the remote file, files or folder that you want to work with.
Click the refresh button at the top of the files panel.
Select the remote file, files or folders that you want to work with a second time.
Right click and choose "Delete".

Getting this too. Quick workaround: if you have Dreamweaver show just the remote site, the delete action works as it should (i.e. it deletes the remote file). I'm not keen on this view, but for now it'll be a case of using it to delete remote stuff. Hope this is fixed soon - do any of you know if there's an 'alert me' function on an Adobe bug report? Cheers, Andy

Best way to transfer database to new website without access to previous one

I have an old website whose dashboard login info I've lost, so I don't have access to its database.
I'm now building a new one. What's the best way to extract the previous data (mp3 files, articles, mp4 files) so I can use it in the new site?
I hope there is a way to extract it that isn't manual.
Thanks
I'd suggest that in future all extraneous matter (images, mp3s, etc.) be held in a separate folder on your server and hyperlinked to where it appears on a page.
That folder can then be downloaded for safe-keeping.
In this case, I can only suggest that if the website is still up (since you've simply lost the keys), you could, as an ordinary user, download the whole website (which wouldn't include the database) with something like wget or HTTrack, and from that extract what items you can (filtering by extension if need be).
At the very least you should get an idea of which items were present, even if they don't download.
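For a mirror, wget or HTTrack are the robust tools (e.g. wget can restrict a recursive download to given suffixes with its accept list). Purely as an illustration of the filter-by-extension step, here is a minimal standard-library Python sketch that pulls media links out of a saved HTML page; the extensions and attribute names are illustrative, not a complete crawler:

```python
# Sketch: extract links to media files from downloaded HTML.
# wget/HTTrack do this (plus recursion) far more robustly; this only
# illustrates the "filter by extension" idea.
from html.parser import HTMLParser

MEDIA_EXTENSIONS = (".mp3", ".mp4")

class MediaLinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Media may be referenced from <a href>, <audio src>,
        # <video src>, <source src>, etc.
        for name, value in attrs:
            if (name in ("href", "src") and value
                    and value.lower().endswith(MEDIA_EXTENSIONS)):
                self.links.append(value)

def media_links(html_text):
    """Return all media-file links found in an HTML string."""
    collector = MediaLinkCollector()
    collector.feed(html_text)
    return collector.links
```

You could run this over every saved page of the mirror and then fetch whichever links didn't download the first time.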
As for articles, either the website online, or the downloaded website should display them in a browser, and you could copy the text.
None of this applies if the website is no longer online; but I wish you well whatever the outcome.

File explorer first instance hangs, second runs OK

Very often, when I open File Explorer (on Windows 10) it "hangs", showing empty panels with a search bar at the top slowly moving to the right, with nothing else happening. However, if I open a second (or third, etc.) instance of File Explorer, it immediately works and shows all the files (while the first keeps on "searching" with no results). If I close everything and re-open Explorer, once again the first instance hangs on this sort of search activity, while the second works fine.
Unfortunately this behaviour is a bit random (see the "very often" above). I have tried closing any other program that might interfere, but couldn't find anything, and I couldn't find much help by googling it either. My guess is that at some point a program I run corrupts something...
Any hint on where to look?
I just ran into this problem for the umpteenth time, and I definitely found the fix for this iteration. After trying several things, this was the instantaneous fix for the slow scanning bar and slow column sorting:
Navigate to the problem folder in the Windows Explorer folder view.
Right-click the folder and select "Properties".
In Properties, select the "Customize" tab.
Make sure "Optimize this folder for:" is set to "General Items" AND "Also apply this template to all subfolders" is checked.
Click OK.
This immediately brought my problem folders to life and everything worked normally. I would recommend doing this at the drive level, then going back and customizing folders such as Pictures, Music, and Videos. The reference that solved the issue is quite old, but this still seems to be an issue (and the fix).
Before this I had tried rebooting several times, cleaning out shortcuts, SFC, DISM, and several other things that had no effect. Now opening any folder is lightning fast with no scan, and sorting by name, type, size, or date modified is just as quick.

"Respawning" Files in Cpanel

So, straight to the point: I'm trying to clean my host entirely (databases too), and after I delete the last two items, wp-content and wp-includes (700 MB of files), they get restored instantly. This may be a simple question, but to me it's very odd and I don't get it. Besides the file manager I used FileZilla too, and the same thing happens (my hosting company, which su#%$, failed to give me a reply after 48h).
I have recorded a short video of my problem to help you better understand my issue.
https://www.youtube.com/watch?v=wqL35R0-vvw&feature=youtu.be
Hope you'll be able to help me. Thank you!
I'm working on this website for an NGO after it was hacked, and for now I want to wipe every single file from the server and rebuild it, but the files that contain infected pages (PHP scripts) won't get deleted.
Chances are very good some of those files are owned by the webserver, especially if you were compromised via a WordPress vulnerability. As they're owned by the webserver and not your user, you're unable to delete them.
If you have root/sudo access, you can use that on the command-line to remove them. If you don't, you'll need your host to help.
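To see whether ownership really is the culprit before involving the host, a small stdlib Python sketch like this (run over SSH on the server) lists files under a directory that are not owned by your own user; those are exactly the ones an FTP or file-manager delete will fail on. The paths are illustrative:

```python
# Sketch: find files under `root` NOT owned by the current user.
# These are typically files created by the web server (e.g. via a
# compromised WordPress), which your FTP user cannot delete.
import os
import pathlib

def files_not_owned_by_me(root):
    my_uid = os.getuid()  # numeric UID of the user running this script
    return [str(p) for p in pathlib.Path(root).rglob("*")
            if p.is_file() and p.stat().st_uid != my_uid]
```

For example, `files_not_owned_by_me("wp-content")` returning a long list would confirm the diagnosis; with root/sudo access you could then remove those files from the command line, otherwise that list is useful evidence to send to your host.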

Dealing with server stranded file uploads

I have an Angular SPA app running on Azure and I'd like to implement a rich text editor similar to Medium.com. I'll be using some existing editor for this, but I have a problem with image files.
The problem
I would like my editor to also be able to add images inside the content. The problem I'm having is deciding when I should upload the images to the server.
Possible solution 1: Immediate upload after they're selected
The good
saving content is quicker because all images are likely already uploaded
files get displayed right after they're uploaded from the server URL
The bad
files may have to be deleted on the server if the user cancels editing
files may get stranded on the server if the user simply closes their browser window
Possible solution 2: Upload after save
The good
files get displayed immediately using FileAPI capabilities
no stranded server side files if editing is discarded in whatever way
The bad
saving of content may take longer as all images need to be uploaded at the moment of saving content
additional client-side code is needed to display images from local files
Question
I would like to implement Solution 1 because it provides a more transparent user interface and reacts more quickly to saves => better UX. But how should I manage the stranded files? I could use a worker process that deletes stranded files from time to time, but I wonder whether that's the best approach for this scenario.
What and how would you suggest I implement this?
This is highly subjective (opinion based), but I will give it a shot.
You actually have a bigger problem than you think. In your two approaches you only describe the situation where the user starts something new, whereas much harder-to-solve issues arise if the user is editing an existing item. What happens if he/she deletes images, adds new images, and at the end hits CANCEL? And what if the Internet connection drops while creating/editing?
I would also go for Solution 1 and, of course, minimize the "bad" things, as they aren't really that many or that hard to handle. Here is how I would solve all the "bad"s in Approach 1:
All my articles (or whatever user is editing with editor) will have a boolean flag "IsDraft" or something like this. Then all my business logic for front end will only look for items where IsDraft == False.
Whenever a user starts a new article (the easiest problem to solve), I immediately create a new item in my DB with IsDraft=True.
Have a link table to keep the link between the ID of the item being created and the image files being used (blobs). The point here is that if you do not keep links between used and unused blobs, you will have a lot of headaches deciding which blobs to delete and which to leave on the storage.
Have a worker process (either a worker process in a Web Role if I use Cloud Services, or a WebJob if I use Web Sites) that checks for articles that are drafts and older than XXX days. If found, delete the files + the article itself.
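The flag + link-table + worker idea above can be sketched roughly like this. This is an illustration only: the schema (articles.IsDraft, article_blobs) and the delete_blob placeholder are assumptions, not an existing API; in a real WebJob the same queries would run against your actual store and the Azure blob SDK. SQLite stands in for the database here:

```python
# Sketch of the draft-cleanup worker: delete stale drafts and exactly
# the blobs the link table ties to them (schema names are assumed).
import sqlite3
from datetime import datetime, timedelta

DRAFT_MAX_AGE_DAYS = 7  # the "older than XXX days" threshold

def delete_blob(blob_name):
    # Placeholder: in production this would call the storage SDK.
    pass

def purge_stale_drafts(conn, now=None):
    """Delete drafts older than the threshold, plus their linked blobs."""
    now = now or datetime.utcnow()
    cutoff = (now - timedelta(days=DRAFT_MAX_AGE_DAYS)).isoformat()
    cur = conn.cursor()
    cur.execute("SELECT id FROM articles WHERE IsDraft = 1 AND created < ?",
                (cutoff,))
    stale_ids = [row[0] for row in cur.fetchall()]
    for article_id in stale_ids:
        # The link table tells us which blobs belong to this draft, so
        # nothing referenced by a published article is ever touched.
        cur.execute("SELECT blob_name FROM article_blobs WHERE article_id = ?",
                    (article_id,))
        for (blob_name,) in cur.fetchall():
            delete_blob(blob_name)
        cur.execute("DELETE FROM article_blobs WHERE article_id = ?",
                    (article_id,))
        cur.execute("DELETE FROM articles WHERE id = ?", (article_id,))
    conn.commit()
    return stale_ids
```

The link table is what makes the cleanup safe: without it, the worker cannot tell a stranded blob from one a live article still uses.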
Handling the editing of an existing item is more challenging; for this I might take the following approach:
Create a new copy of the entire article when the user hits Edit, and mark it as a draft.
If the user hits Save, switch the content of the new article (new version) with the existing one, leaving the new copy marked as IsDraft - the worker process will clean it up.
If the user doesn't hit Save for some reason (hits Cancel, the Internet drops, the computer restarts, the browser crashes, ...), the new article will be cleaned up later by the worker process.
And if you want to go deeper and crazier, you can have a section in your admin panel where you show the drafts to your users, so they can either continue working on them or leave them to be auto-cleaned.

File doesn't show up in TYPO3 Fileadmin

Is TYPO3 doing some indexing of the filesystem into the database?
We are trying to add a video to our page here, but the video isn't selectable from the file window although it's been put into the right directory.
The Fileadmin says "9 records found" in the folder but displays only 7 files, because someone might have deleted two of the files from the folder.
But the new video file we've put there won't be displayed either.
Is there any way to manually start TYPO3's file indexer?
I don't think TYPO3 does any indexing of the files; it just reads the list of the files straight off the filesystem. So there's no file indexer to start.
My suggestion for your problem would be to check the permissions on the file you uploaded and make sure it's readable by whatever user the Web server is running as ('apache', 'www-data', etc). If it isn't readable by the Web server it won't show up in the fileadmin area.
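As one way to act on that suggestion, here is a small stdlib Python sketch that walks a directory and lists files missing the world-read bit. It assumes the web server runs as a different user and reads files via the "other" permission class; on group-based setups you'd check the group bit instead. The path is illustrative:

```python
# Sketch: find files under `root` that lack the "other" read bit,
# i.e. files a web server running as a different user cannot read.
import os
import stat

def unreadable_by_others(root):
    bad = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            if not mode & stat.S_IROTH:  # no read permission for "other"
                bad.append(path)
    return bad
```

Running `unreadable_by_others("fileadmin")` on the server would flag the freshly uploaded video if a permission problem is indeed the cause.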
Okay guys, I feel stupid right now, although I can't really explain the behaviour...
What happened seems to have been some strange caching failure.
If you click the "choose file" button, it shows the window as seen in the screenshot above, displaying the last-used folder.
But: this view is somehow cached.
If you select the same folder again in the file tree on the left, the view is updated and the missing files are shown -.-
