Mikael Kalms

Members
  • Content count: 79
  • Joined
  • Last visited
  • Days Won: 10

Mikael Kalms last won the day on March 17, 2017
Mikael Kalms had the most liked content!

Community Reputation: 1 Neutral

About Mikael Kalms
  • Rank: Advanced Member

  1. Mikael Kalms

    How to verify workspace integrity?

    Thanks. I just got access to the machine. The file did indeed have different content locally. It also had identical "modified" and "created" timestamps, dating from back when the Unity editor first created the file on that machine. I can't explain the process that led to this situation, but there was a local difference on the machine that neither 'cm status . --all' nor 'cm update . --all' recognized. We deleted the file manually and updated to get a good version of it.

    Since there seems to be no option to check all local content (ignoring metadata, or rather verifying that the metadata is not inconsistent), we will resort to a more blunt method in this kind of situation in the future, once we know for sure that the problem is a file content mismatch between machines: delete all files in the workspace, then update to get good versions back onto the machine.
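
    A rough sketch of that blunt approach, assuming a Unix-like machine and a standard workspace that keeps its Plastic metadata in a .plastic folder at the workspace root (the paths and the exclusion pattern are illustrative, not verified against this particular setup):

      # Run from the workspace root: remove every working file but keep the
      # workspace metadata, then let Plastic re-download clean copies.
      cd /path/to/workspace
      find . -mindepth 1 -not -path './.plastic*' -delete
      cm update .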
  2. Mikael Kalms

    How to verify workspace integrity?

    Hi, we have a situation on one machine where I believe (95% certain; I will be able to check tomorrow) there is a file in the workspace whose contents do not match the repository version.

    The client is probably running Plastic 7.x and has "Check content (hash) when the file timestamp is modified to set it as 'Changed'" enabled. The client is probably not running Plastic Change Tracker. It is possible that the file has a timestamp identical to what's in the repository. It is possible that the file also has an identical size. I will make some more detailed checks tomorrow.

    1. Is there a command I can run that will make the Plastic GUI or cm perform a full validation of the on-disk workspace contents, regardless of timestamps?
    2. If the timestamp (and perhaps also the size) is identical, will the Plastic GUI / cm assume that the file's content hasn't changed, without actually looking at the bytes within?
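
    In the meantime, one content-only check that doesn't depend on Plastic's change detection at all is to compare the suspect workspace against a second copy updated to the same changeset on a known-good machine. A minimal sketch, assuming Unix tools and hypothetical workspace paths:

      # Hash every file in both trees (ignoring Plastic metadata) and compare.
      cd /path/to/suspect-workspace && find . -type f -not -path './.plastic*' -print0 | sort -z | xargs -0 sha1sum > /tmp/suspect.sha1
      cd /path/to/known-good-workspace && find . -type f -not -path './.plastic*' -print0 | sort -z | xargs -0 sha1sum > /tmp/good.sha1
      diff /tmp/suspect.sha1 /tmp/good.sha1    # any output line is a genuine content mismatch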
  3. Hi, You are correct in that the left-hand pane should not be empty. I experienced it again just now - no message about "encoding changed", but the left-hand pane was empty. Restarting Plastic showed correct content in the left-hand pane. I have sent client logs & a screenshot to you via your ticketing system: Request #14551.
  4. No repro steps, sorry. It happens randomly.
  5. Hi, I have noticed on a couple of occasions (2? 3?) over the past month that the Diff view in the bottom half of the Pending changes tab failed to display a diff.

    Today, for 4 out of 5 source files, the Diff view guessed that the left pane's encoding was "None" [incorrect], and the right pane's encoding was "UTF-8" [correct]. Manually changing the "Left encoding" setting to "UTF-8" made the "Encoding changed" message disappear. However, the Diff view did not refresh itself fully: the diff result was the same (a blank left pane). Switching between different files in the list of changed items resulted in the Diff view re-identifying encoding, again incorrectly.

    Selecting a file and choosing "Diff workspace contents (Ctrl+D)" opened up a new tab. That tab detected encoding correctly (UTF-8 vs UTF-8) and that diff view showed correctly.

    Shutting down and restarting the Plastic SCM client resolved the problem: the Diff view in Pending changes is working correctly for me now.

    Plastic SCM client version: 7.0.16.2175
  6. Hi, a status update on how the Blue Ocean plugin is working: the last month's Plastic plugin fixes have improved the display a lot. Thanks for that!

    Remaining problem 1: the "commit" column is blank in Blue Ocean's Pipeline view (probably a core Jenkins bug, https://issues.jenkins-ci.org/browse/JENKINS-46521).

    Remaining problem 2: Blue Ocean's Pipeline view sometimes doesn't display the associated changeset info (probably also a core Jenkins bug, though I can't find any JIRA issue for it). Details below...

    The classic Pipeline view is accurate and displays one change per changeset (I am using light-weight checkout, which avoids another bug in Jenkins where changes are listed twice in the classic Pipeline view). However, for some of these jobs, the Blue Ocean Pipeline view shows "Started by an SCM change" as the message; this message is shown when there is no new commit paired with the build job. If I then look at the details for a specific job - in this case, job 1662 - it shows the correct change information for that job (so that view does have changeset information associated with it).
  7. One positive note: when using Light-weight Checkout, Jenkins no longer picks up changes twice in its classic Pipeline view. (Blue Ocean's Pipeline view only counts changes once, both with full and with light-weight checkout.)
  8. Initial tests with the latest Plastic SCM client are good; Lightweight checkout works both in a toy example and for our main product.
  9. This may or may not be relevant: comparing the source code of the Plastic SCM plugin shows that the older code in Workspaces.java uses the Hudson API function FilePath.createTextTempFile(), which places files in locations like "/var/lib/jenkins/workspace/Tests/TestLightWeightCheckout@script/PongSP/selector7941601814195684624.txt". However, the newer code in PlasticSCMFile.java uses the native Java API function File.createTempFile(), which places files in the system's default temp folder. There might be permission problems when Jenkins accesses the system-default temp folder. In any case, it would be cleaner if the PlasticSCMFile code also kept its temp files in the Jenkins-designated temp file area.
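
    If anyone wants to test the permissions theory, a quick sketch (assuming the Jenkins service runs as the 'jenkins' user and that java.io.tmpdir points at /tmp - neither of which I have verified for this install):

      # Check whether the jenkins user can actually create files in the system temp dir.
      sudo -u jenkins touch /tmp/plasticscm-tmpdir-test && echo "writable" || echo "not writable"
      sudo -u jenkins rm -f /tmp/plasticscm-tmpdir-test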
  10. Hi, I'm testing the Lightweight checkout feature. I have created a new Pipeline project (more or less a duplicate of an existing project) with "Pipeline from SCM". When I build with Lightweight checkout off, it works, but with Lightweight checkout on, I get the following error messages in the Jenkins log:

      /var/lib/jenkins$ tail -n 12 /var/log/jenkins/jenkins.log
      Apr 20, 2018 8:24:29 PM com.codicesoftware.plugins.hudson.PlasticTool execute
      WARNING: The cm command 'checkselectorsyntax' failed. Retrying after 500 ms... (1)
      Apr 20, 2018 8:24:30 PM com.codicesoftware.plugins.hudson.PlasticTool execute
      WARNING: The cm command 'checkselectorsyntax' failed. Retrying after 500 ms... (2)
      Apr 20, 2018 8:24:31 PM com.codicesoftware.plugins.hudson.PlasticTool execute
      WARNING: The cm command 'checkselectorsyntax' failed. Retrying after 500 ms... (3)
      Apr 20, 2018 8:24:31 PM com.codicesoftware.plugins.jenkins.PlasticSCMFile getRepObjectSpecFromSelector
      SEVERE: null
      Apr 20, 2018 8:24:31 PM org.jenkinsci.plugins.workflow.job.WorkflowRun finish
      INFO: Tests/TestLightWeightCheckout #4 completed: FAILURE
      Apr 20, 2018 8:24:31 PM org.jenkinsci.plugins.workflow.flow.FlowExecutionList unregister
      WARNING: Owner[Tests/TestLightWeightCheckout/4:Tests/TestLightWeightCheckout #4] was not in the list to begin with: []

    The Plastic SCM plugin then throws this error: https://github.com/jenkinsci/plasticscm-plugin/blob/931952b1b7903dc3fce22bc7cb392317d90e8e2d/src/main/java/com/codicesoftware/plugins/jenkins/PlasticSCMFile.java#L111

    And the 'cm' version:

      /var/lib/jenkins$ cm version
      7.0.16.2143

    Now, I don't know how to intercept and capture the exact cm command and its result. Enabling debugging at level 'debug' for cm.exe according to the article only gives me these corresponding messages for one of the 'cm checkselectorsyntax' invocations:

      2018-04-20 20:35:00,772 INFO 1 cm - STARTING CLIENT
      2018-04-20 20:35:00,971 DEBUG 1 ClientConfig - Time l
      2018-04-20 20:35:01,046 DEBUG 1 ClientConfig - Time loading client.conf (/var/lib/jenkins/.plastic4/client.conf) 238 ms
      2018-04-20 20:35:01,086 DEBUG 1 cm - IsOutputRedirected: [Redirected]

    Here is what the configuration looks like in the Jenkins project configuration page:

      Selector:
        repository "PongSP@<organization>@Cloud"
          path "/"
            smartbranch "/main"
      Use update: yes
      Workspace name: PongSP
      Script Path: PongSP/Jenkinsfile.Windows

    OS version:

      /var/lib/jenkins$ lsb_release -a
      No LSB modules are available.
      Distributor ID: Ubuntu
      Description:    Ubuntu 16.04.4 LTS
      Release:        16.04
      Codename:       xenial
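
    Since I could not find a built-in way to log the exact cm invocations, one workaround - just a sketch, which assumes Jenkins resolves cm through PATH and that the real binary lives at /usr/bin/cm; adjust the paths for your install - is to put a small logging shim ahead of the real executable:

      #!/bin/sh
      # Save as e.g. /usr/local/bin/cm (earlier in PATH than the real cm) and chmod +x:
      # record every invocation with its arguments, then forward to the real binary.
      echo "$(date '+%F %T') cm $*" >> /var/lib/jenkins/cm-invocations.log
      exec /usr/bin/cm "$@"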
  11. After more testing, it turns out that:

    - I had the Plastic Change Tracker installed, but the Plastic GUI client was not configured to use it.
    - These problems happen with the regular "move detection" logic in Plastic, but not when the Change Tracker is active (because the Change Tracker listens to filesystem-level operations and therefore sees the filesystem-level move).
    - These problems do not happen when using the 'cm mv ...' console command, as that performs the move and logs the move operation with Plastic at the same time.

    Lessons for me:

    - Ensure that I and my colleagues have the Change Tracker installed and active (following the instructions in the 6.0.16.1151 release notes document).
    - When problems arise, use 'cm mv ...' to perform the moves.
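
    For reference, the console form looks like this (the paths are made-up examples from a Unity project):

      # Move the file on disk and record the move with Plastic in one step,
      # so the rename is not later mis-detected as a separate delete + add.
      cm mv Assets/Scripts/OldName.cs Assets/Scripts/NewName.cs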
  12. Mikael Kalms

    Replication of projects with Xlinks

    Subtree branches sound intriguing. It will be interesting to see how they work in practice.
  13. Mikael Kalms

    GitSync does not like replicated Plastic repos

    Codice Software has confirmed that this is a limitation of the current GitSync implementation. We have a problematic situation because of the combination of all these factors:

      1. We are using Plastic Cloud and not Plastic Server.
      2. We have open source software as part of our main product.
      3. We keep the open source portions in separate Plastic Cloud repos.
      4. We have Xlinks from the main product's Plastic Cloud repo to the open source Plastic Cloud repos.
      5. We use GitSync to sync between local Plastic repos and GitHub repos.
      6. We have more than one developer who wants to perform GitSync on (replicated versions of) the same Plastic repo.

    1+2+3+4+5+6 = we run into the above problem.

    What we are going to do is move to a GitHub-centric workflow for the open source software: developers will work directly in Git, or, if they prefer, they can create local Plastic repos and use GitSync so that they can work on the project using Plastic. These local Plastic repos will not be replicated anywhere, and in particular not to Plastic Cloud. They can be considered throw-away repos; the source of truth will be the GitHub repo.

    We will also stop using Xlinks. Instead, we will update the open source projects within our product by copying files in manually. We will lose the detailed versioning in that step, but we avoid all the complexity that Xlinks bring.
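
    For the throw-away local repos, the sync step would look roughly like this - a sketch from memory, with placeholder repo and URL names, and I have not double-checked the exact cm sync argument order:

      # Keep a purely local Plastic repo and sync it against the GitHub repo;
      # never replicate this local repo to Plastic Cloud.
      cm sync OpenSourceLib@localhost:8087 git https://github.com/<organization>/OpenSourceLib.git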
  14. Mikael Kalms

    Selecting/deselecting many directories with Gluon

    Hi Wolfram, just to give you some ideas on how you could work: we are also working on a Unity3D project, with 12 devs (6x art, 3x code, 3x design).

    We keep source-format assets on Google Drive, and only game-ready content (FBX, TGA, and so on) plus game source code in Plastic. The latest working copy is ~5 GB in size for us.

    Everyone on our team uses the full Plastic GUI client (not Gluon). We chose the full client because it allows working with branches. Most people started out working directly on main, then I trained them to work on personal branches, and then on task-based branches. Today 70% use task-based branches and 30% use personal branches. The Branch Explorer is a good enough graphical view for all disciplines to understand this way of working.

    We do not use exclusive checkout. We do not use the Unity/Plastic integration. We force meta files to be visible and force text format for all assets. We pay a lot of attention to keeping our data files small and mergeable (including extensive use of prefabs just to move content out of scene files). Through communication and coordination, we avoid situations where we would need to merge scene files. This works very well for us so far.

    In the future, if Plastic's roadmap allows for quicker check-ins of large data files and the "nodata replica" workflows improve, we may consider moving our Google Drive content into Plastic (either into the main game repo or into a parallel repo).
  15. Hi, I recently tried establishing a workflow where we publish one of our projects on GitHub. We would do the primary development in a repo in Plastic Cloud. The idea was that developers would replicate the Plastic repo to their local workstations, develop there, and then run GitSync to update the public version of the project.

    However, when using GitSync, Plastic stores the GUID of the first Plastic repo used in the repo's metadata. This information is then uploaded to Cloud at the next push, and fetched by the other developers when they pull. When the other developers attempt to use GitSync, they get an error message:

      "The sync cannot start because the target repository was replicated from a repository synchronized with git. The repository originally synchronized is 'GitHubTest1@local - https://github.com/Kalmalyzer/PlasticGitHubTest.git'. Please contact support for further info."

    This means that only the first developer who used GitSync can perform the sync. Worse yet: if that developer deletes their repo, recreates it, and replicates from Cloud, no one on the team can use GitSync any more. Am I missing something vital here?